Hand-held devices: Dumb vs Smart

A Doro 8080 hand-held device

I prefer to call my smallest computer a hand-held device, rather than a phone. That is because it has to perform numerous tasks, most of which are totally unrelated to making voice calls to another human being.

One of the conversations I frequently have with myself has to do with technological choices. Why do people opt for smart devices, rather than their slightly less intelligent siblings? Or, why are people willing to pay a premium for smart devices? An even more alluring question is, why are people willing to pay a premium for dumb devices? This can be a very real situation, especially when the dumb phone in question is a Light Phone. My standard answers have to do with apps. Someone I don’t know and don’t want to know will demand that I install and use some specific app in order to undertake something that could be done with a much simpler and more generic procedure. That someone has mandated the use of a specific app. I need to use that app because I need (want, is usually the more correct term) the service or product being provided. In other words, I opt to make a bad choice because I want a particular product or service more than I value my privacy.

Yes, I can give specific examples. Buzz, our daily drive, is equipped with studded snow tires in the winter, the result of driving on under-plowed, but over-iced roads. Some places, such as Trondheim municipality, require vehicle users with studs to pay a stud use fee. We accept this fact. Unfortunately, installing an app to do so is not something that can be done on the road. The information needed to associate a bank account with the app is safely locked/ hidden in our house. It is not taken on road trips. Of course, we would prefer to pay with our bank card, which was possible to do through the end of 2023. However, on a drive to Trondheim during the winter of 2024, the organization collecting the fee had removed the option to pay with a bank card, and wanted to collect additional information about us. As this situation arose about 120 km away from Cliff Cottage, our only solution was to: 1) not register for the app; 2) not use an uninstalled app to pay the use fee; 3) risk having our snow tires discovered, which would result in a fine. We were not caught. Since then, we do not drive into Trondheim in the winter. This means that we buy fewer things in Trondheim. Sorry, Ikea!

Paying for electric vehicle charging is another area where it has been possible for the energy providers to demand use of an app. For years, the Norwegian electric car association has been encouraging the Norwegian government to enact legislation that will allow the use of bank cards. This has now started to happen, and all new stations have to provide this option. Established stations will have until the end of 2025 to implement changes.

This is an interesting situation because most stores in Norway are required to accept cash payments. So, if I enter the Coop to buy an orange, they are required to accept a cash payment for it. If I then drive across the street to the local charging station, that station is allowed to insist that I install their app, until the end of 2025! The government that is demanding that stores allow people to pay in cash has managed to create a situation that allows charging stations to avoid cash or card payments!

Apart from a few conversations, some messages and taking lots of photos, there is not much that I want a smart device to do. Yes, I have the capability to use a smart device, but agree with many users who complain that smart devices have become too demanding, especially with abusive social media algorithms.

There is an entire industry developing anti-smart phones. One example is the Light Phone, originally launched in 2015 following a successful Kickstarter campaign. A Light Phone II arrived in 2018. They were minimalist devices, with black-and-white displays, but without a camera. In 2024, the Light Phone III emerged, with a 50 Mpixel rear camera and an 8 Mpixel front = selfie camera. This resolves my main complaint, but opens several new ones. The display is still black-and-white, despite the phone capturing colour images. This means they have to be viewed in monochrome until they are downloaded. There is no headphone jack, a feature I appreciate on my smart device.

More positively, its dimensions are 106 x 71.5 x 12 mm. Its display is 100 mm with 1080 x 1240 pixels resolution. It has a 1 800 mAh removable battery. Optional tools include: alarm, calculator, calendar, directory, turn-by-turn directions, hotspot, music, notes/voice memo, podcast and timer features. It provides a near-field communication (NFC) chip that allows devices to exchange small amounts of data with each other over relatively short distances. It also has 5G connectivity and fingerprint ID.

Yet, I hesitate to enter the dumb phone world. Take the revived Nokia 3210, discussed below: one problem is its 2 Mpixel camera, far from the 50 Mpixel camera I am used to. Then there is the under-powered Unisoc T107 chipset, a minimal 64 MB of RAM and 128 MB of storage, expandable up to 32 GB via the microSD card slot. Then there is 4G, not 5G.

A Light Phone III, expected to launch in 2025.

The Light Phone III is durable, constructed with a metal frame and recycled plastic panels, with an accessible and replaceable battery. The display and USB port are claimed to be more easily replaceable than those on conventional smart devices. When it launches in 2025, it is expected to cost about ten times the price of a Nokia 3210, or about 30% more than the price of my current smart device, an Asus Zenfone 9.

If I were to revert to a dumb device, I hope it would be a more stylish Nokia product, with its origins in Finland. Unfortunately, this device is targeted towards younger users, those under 16, who are likely to be banned from taking smart phones to school.

This Nokia 3210 (2024) is shown in Y2K gold. It is probably more suitable for younger users than the elderly.

The revived Nokia 3210 was relaunched in 2024. The original Nokia 3210 was designed by Alastair Curtis (? – ) in Nokia’s Los Angeles Design Center. It was released in 1999, and became the 7th most-sold phone in history. The new phone has been restyled, appealing to dumb phone fans. Its dimensions are 122 x 52 x 13.1 mm. It offers a 60 mm 240 x 320 pixel in-plane switching (IPS) display, characterized as having the best colour and viewing angles, a USB-C connector for data transfer and charging, 4G connectivity, Bluetooth support, and dual-SIM capabilities. It runs Series 30+ based on Mocor OS, a proprietary operating system, and has Cloud Phone technology support. This means it is capable of accessing YouTube and Google Sign-in Services as well as real-time modern web applications. The removable Li-ion 1 450 mAh battery provides up to several days of service. In terms of price, it is less than 15% of my current smart device.

For an old geezer/ geezette, the most likely brand for a dumbish device is probably Doro, with its origins in Malmö, in southern Sweden. It refers to itself as a consumer electronics and assistive technology company. It was founded in 1974 as a challenger to the state-run telecommunications monopoly, and developed communications products and services for the elderly, such as mobile phones and telecare systems.

The 8080 model is produced for people with eyesight, hearing and motor skill issues. If one is having dementia issues, the 8050 model could be more appropriate. In a smart move, Doro claims that these are smartphones. Yes, anyone can claim that because there are no official definitions about what is dumb and what is smart. However, Doro also produces other devices, including those with a clam shell shape, that they refer to as dumb phones.

Doro’s interface, Eva, is described by them as: patented, intuitive and action-based. Their sales propaganda claims it is: like having someone who understands your needs always by your side. Users never need to look around for things they can’t find. Eva simply gives them a few clear choices, and then does what the user wishes based on their response. She’s also the perfect companion when starting up the phone for the first time, guiding the user every step of the way. And because she is designed by Doro, Eva makes the technology fun, available and easy for everyone, whether new to Android, or a long-time user and fan.

The Doro Quick Start Guide admits that three types of people may be setting up a phone. One can choose between “Yes, I am a beginner”, “No, I have already used one” or “I’m setting the phone up for someone else”.

The phone is 174 x 74 x 9 mm in size, weighs 175 g, and has a 5.7 inch 1440 x 720 display with a narrow bezel, so it will fit into a pocket, handbag or backpack. The screen is large enough to be readable. There is 32 GB of storage, which is minimal for such a device. However, the SIM drawer can accommodate a microSD memory card of up to 128 GB. To open the drawer one just uses one’s fingernail. On the top edge of the phone is a standard 3.5 mm headphone port, while at the bottom is a microphone, a loudspeaker and a USB-C power connector.

The display has very clear icons, its brightness can be adjusted, and there is the option to change the type size. Similarly, its audio can be optimized for moderate hearing impairment, and the phone offers hearing aid compatibility.

A proximity sensor turns off the touch screen when the device is held up to one’s ear. Cameras include a 5 MP front/ selfie camera. On the rear is a 15 MP camera with a flash that can also act as a flashlight, as well as a second microphone. There is also a fingerprint sensor. My major concern is that the photo equipment won’t meet the needs of the target audience or, at least, myself.

The phone also includes ICE = In Case of Emergency, and other assistance features. Personal contact and health data, such as blood groups and allergies, can be stored on the phone so as to be readily accessible. An assistance button on the rear of the phone will initiate a call and an SMS alert, complete with location, to be sent to the programmed telephone number of someone relied upon to respond.

The OS is based on Android 9 (Pie), from 2018. While relatively old, it is also straightforward to use, with most of the bugs worked out. The View button offers a choice of 1) messages, 2) emails, 3) call history or 4) pictures and videos. The Send button takes one to messages, emails, a picture or video, or the device’s current location. There are also pre-installed Android/ Google apps: Gmail, Maps and YouTube. Extra apps can be downloaded and installed. All apps are listed alphabetically, while the most frequently used ones can also be displayed on the opening screen for immediate access.

The Doro manual for the device can be downloaded. It provides useful background information and instructions for using apps. While the phone is only available in either black or white, there are cases/ covers available that make the phone more colourful.

Other related products from Doro include: HandleEasy = a seven-button basic remote control for radios and televisions; 3500 alarm trigger = a wristband/lanyard pairable with all Doro mobile phones with texting ability; Tablet = an Android tablet based on the EVA Android interface; Watch = a smartwatch similar to Motorola smartwatches, compatible with Android touchscreen phones; HearingBuds = sound enhancement devices compatible with Android touchscreen phones.

Compromise

It has been difficult to find reliable statistics about dumb and smart hand-held devices. Here are some best guesses from statcounter.com. At some point in 2024, it was estimated that there were about 4.88 billion smartphone users = people worldwide, representing more than 60% of the global population. At the same time, the number of smartphones = devices in use globally was estimated to be about 7.21 billion. It is easier to get percentage data. In 2025-01, more than 72% of hand-held device users used some variant of Android, while more than 27% used iOS from Apple. This means that far less than 1% use something else. The number of Linux based hand-held devices (including Lineage) is estimated at 0.01%, with Lineage on 1.5 million devices.
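For readers who like to check such figures, here is a minimal sketch in Python that takes the statcounter estimates above at face value and derives two quantities the paragraph relies on: the number of devices per user, and the residual share left for everything that is neither Android nor iOS.

```python
# A small sanity check of the statcounter figures quoted above,
# taken at face value.
smartphone_users = 4.88e9      # people worldwide, 2024 estimate
devices_in_use = 7.21e9        # hand-held devices in use, 2024 estimate
android_share = 72.0           # % of users on Android, 2025-01
ios_share = 27.0               # % of users on iOS, 2025-01

print(f"Devices per user: {devices_in_use / smartphone_users:.2f}")   # ~1.48
# The text's "more than" qualifiers push the remainder below 1 %.
print(f"Everything else: at most {100 - android_share - ios_share:.1f} % of users")
```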

The distribution of devices varies geographically. Here are the market shares in % for Android, followed by iOS, in 2025-01: Africa = 85.29/ 13.58; Asia = 80.1/ 19.4; Europe = 65.61/ 33.92; North America = 48.86/ 50.89; Oceania = 47.99/ 50.24; South America = 87.04/ 12.79. So in places like Canada and Australia there are more iPhone users than Android users. Norway follows European trends with a 65.06/ 34.54 split. The poor prefer Android.

One solution is for people to refuse a transition to a new device, such as a Doro, by keeping their existing devices, but updating them with appropriate software. For us that means keeping our current Zenfones far into the future, potentially beyond the manufacturer’s end of support. There are some sites that claim that support for the Zenfone 9 ended on 2024-07-28. This would come into conflict with European consumer protection rights. So far, there is no indication on our phones that we have experienced this. I suspect that Asus will not provide us with new versions of Android (we have version 13), but that security updates will be available for at least five years = 2027-07-28, possibly longer.

We have two 2018 Xiaomi Pocophone F1s in storage that could become test mules for new device operating systems. I am most keen to investigate the e.foundation variant of Lineage OS. It could be installed on one Pocophone, along with the BaldPhone launcher. Hopefully this can be set up at Easter 2025 (with school holidays in the period 2025-04-11 to 21). While doing this, detailed plans for transitioning the Zenfone 9s to these variants can be developed and documented. They can then be implemented when an appropriate time comes.

It should be possible to test out basic programs/ apps and to see if they work better with one or the other variant of Lineage. As the Zenfone 9s age, it should be possible to pick up a third one cheaply, which could be regarded as a spare, in case one of the original ones fails.

Admittedly, people have valid concerns about Lineage as a serious supplier of an OS. At best, their 2018 April Fools Day prank was in bad taste, and is still remembered almost seven years later. It should not have happened. The prank involved a request for users to install specific software that most thought was an update. Once installed, the device advised that the prank-related software could not be uninstalled, and would be used for coin mining that others would profit from. Official apologies were issued on 2018-04-10. A company with so few users can’t afford to offend anyone with any sort of prank.

To appreciate the next section, it can be important to understand the definition of a launcher, in terms of a computing device. It is an app that changes the user interface of a device’s home screen, allowing the customization of the layout, icons, and overall look of the device. It helps people organize apps, access notifications and personalize their device experience.

BaldPhone is an open-source launcher that targets elderly people. A German psychology and aging journal examined responses from over 14 000 participants in the German ageing survey, who were asked at what age someone becomes “old”? People in their mid-60s generally said 75, suggesting that as people approach old age, they tend to push the marker further back. There is a demographic shift occurring in the world, with decreased fertility, and an increasing number of individuals aged 60 and over. Yes, at 76, I fall into that category myself. According to the United Nations, by 2050, the global population of older adults is projected to reach 2.1 billion.

The standard interfaces of popular mobile operating systems—iOS and Android—typically feature small icons, intricate menus and difficult to understand settings. Some people admit that this can be confusing for older people. Others avoid such comments, and simply say that there is a need for a more user-centric approach to device operating system design.

An alternative to buying a new device is to replace/ update parts of the operating system, such as with the BaldPhone open-source interface. Its propaganda states that: BaldPhone offers a clean, straightforward layout that minimizes distractions and focuses on essential functions. With large, easy-to-read icons and a simplified user experience, BaldPhone empowers elderly users to interact confidently with their devices.

Key Features: User-Friendly Interface with oversized icons and a limited number of options per screen. An attention to size ensures that users with limited dexterity can easily tap the desired applications.

Customization: Users, other family members, friends or caregivers can personalize the launcher according to the user’s preferences. This adaptability helps instill confidence in users, as they can easily access their favorite applications and communication tools without wading through unnecessary features.

Essential Applications: users can highlight essential applications such as calls, messages, and emergency services.

Accessibility Features: Built-in accessibility tools, including text-to-speech capabilities and voice commands. This functionality is particularly valuable for seniors with visual impairments or those who may struggle with traditional typing methods.

Emergency Services: Quick-access emergency buttons allow users to reach out to their emergency contacts or dial emergency services just by tapping a single button.

Notification Management: Many older people find frequent notifications from various apps distracting. BaldPhone allows users to reduce or eliminate non-essential notifications, enabling them to focus on what matters most.

BaldPhone is built on the Android operating system. Developers claim it combines the robustness of a mature platform with a specialized user interface for older users. The open-source nature of the project allows developers to modify the code and tailor features, fostering a collaborative community focused on enhancing accessibility.

More information about BaldPhone can be found here.

Accessories

Neither a smart nor a dumb hand-held device can operate alone. In the European Economic Area (EEA = an extended European Union) users no longer receive a charger when they buy a phone. All new phones, even those made by Apple, connect using USB-C. Currently, both Trish and I have chargers that allow charging through four ports: two use USB-C and connect to our laptops and hand-held devices; two use USB-A to attach to other items, including portable lighting and Trish’s hearing aid container, which uses a USB-micro connector. We have connectors fitted with USB-micro at one end, and USB-C or USB-A at the other end. The USB-C standard applies (or will soon apply) to all smaller devices needing to be charged to operate. We bought our two Acer Swift 3 laptops in the same store, the same day. When they came home, they were fitted with two very different barrel jack chargers, despite having power available through a USB-C port (as well as a barrel jack). We believe these chargers with both USB-C and USB-A ports will have a life in excess of ten years, meaning they could be the last chargers we acquire.

Most hand-held device users strengthen their displays with Gorilla Glass, or its equivalent. Wikipedia tells us: The iPhone that Steve Jobs (1955 – 2011) revealed in 2007-01 still featured a plastic display. The day after he held up the plastic iPhone on stage, Jobs complained about scratches that had developed on the phone’s display after carrying it around in his pocket. Apple then contacted Corning and asked for a thin, toughened glass to be used in its new phone. The scratch-resistant glass that shipped on the first-generation iPhone would eventually come to be known as Gorilla Glass, officially introduced in 2008-02. Corning further developed the material for a variety of smartphones and other consumer electronics devices for a range of companies. There have been seven generations of Gorilla Glass produced, the latest being Gorilla Glass Victus 2. We have not fitted this type of product to any of our laptops or screens, and all have survived without issue. We have Gorilla Glass Victus (1) on our hand-held devices.

The third accessory that we have on our hand-held devices is a cover. This allows us to store cards with the device, eliminating the need for any other type of wallet. Our covers are in turquoise (Trish) and pink (Brock), which helps identify the specific device, since both devices are black. My covers always wear out faster than Trish’s. Thus, I bought three covers to last the lifetime of the device. I am still on the first cover. Trish has bought just one, and it shows very little wear. One always has to be very selective about covers, especially if other needs than just physical protection are to be served.

The last accessory that is useful with a hand-held device is a stylus (passive or capacitive), that acts like a substitute finger when touching a device screen. With a passive stylus, there is no electronic communication between the stylus and the device. The device treats the stylus as a finger. These are considered less accurate than active styluses.

An active stylus includes electronic components that communicate with a device’s touchscreen controller, or digitizer. These are typically used for note taking, on-screen drawing/painting and electronic document annotation. They avoid the problem of a finger or hand accidentally making contact with the screen.

A haptic stylus uses haptic technology = realistic physical sensations which can be felt. Sometimes these can be enhanced by auditory = sound and tactile = touch illusions. A stylus is particularly useful for typing on a miniature keyboard. My fingers are too large for the virtual keyboard provided with hand-held devices.

It is possible to make an inexpensive passive stylus. My starting point is a dead Pilot V-ball 0.5 mm pen, originally filled with liquid ink. Other pens work equally well. Pens regularly run out of ink, making them useless for writing on paper, but ideal as a passive stylus. At my current rate of writing, I am able to construct a new stylus about every three months. It is repurposing, rather than recycling. To distinguish a pen from a stylus, I use white electrical tape around the clear ink tank on the stylus, so that it differs from the pen. My latest intention is to give them away as holiday season gifts.

Two styluses, former pens, marked with white electrical tape to avoid confusion with pens currently in use.

Note:

Despite my obsession to date people and things, I have been unable to determine when Alastair Curtis was born. However, he received a Bachelor of Science at Brunel University, then a Master’s degree in Industrial Design Engineering at the Royal College of Art in London. He worked as a designer for Nokia starting in 1997. This probably indicates that he was born in 1974 or earlier. He became chief designer, senior vice president and head of the design department at Nokia in 2010. He was chief design officer for Logitech from 2013 to 2024. Logitech’s designs hold great appeal for me, and most of my peripherals are made by them. That includes all of my keyboards, pointing devices and headsets, as well as my only desk mat, 30 x 70 cm in lilac! In 2024, Curtis became chief designer for VF Corporation (Vanity Fair Mills in a previous life), known for their outdoor wear, with brands such as The North Face and Timberland. Of these, I use none. This move by Curtis may be related to former Logitech CEO Bracken Darrell becoming CEO of VF Corporation in 2023.

Optics 9

This weblog post is the ninth of a series about optics and optical equipment. It is about hand lenses, stereo microscopes, compound microscopes and digital microscopes, in that order!

Buntzen Lake looking north from public beach at south end. Photo: Patleahy, 2007-05-28.

In my childhood, my father, Edgar, called Mac (1906 – 1991), would regularly take my sister and me for long walks out in the woods. A favourite location was Buntzen Lake. It is 4.8 km long, occupying an area of 182 hectares, and is located north of Port Moody, about 24 km from our house in New Westminster. These walks were the main source of my continued enjoyment of nature.

We usually circumnavigated most of the lake. Today, after the construction of a suspension bridge, the usual walk is about 8 km long, which takes 4 – 5 hours, with an elevation change of 100 m. Before the suspension bridge was built, a shorter walk around most of the lake took about 3 hours, with limited elevation change, possibly 20 m. The lake has provided hydroelectric power since 1904.

Both of my parents were hunters and fishers. One of my mother Jennie’s (1916 – 2021) few regrets was her failure to buy a particular rifle when she had the opportunity. Yet, she travelled the world, visiting in excess of 100 countries. As a parent to an adopted child, I think she was somewhat surprised that I had so little interest in her pastimes. In my childhood I had an interest in birds, from pigeons to loons to falcons.

In my early teenage years, I developed an interest in marine biology, especially plankton, but also marine invertebrates. Several books led me in this direction. One of the first was Rachel Carson’s (1907 – 1964) The Edge of the Sea (1955). This was actually the third of her Sea trilogy, and looked at three edges: rocky, sandy and coral. The focus was on the east coast of North America. The rocky shores were typical of the Cape Ann region of Massachusetts, the sandy shores were of the intermediate coast off the Carolinas, while the corals were part of the Florida Keys.

I also wanted to read something related to the Pacific Coast, where I lived. This was found in Ed Ricketts’ (1896 – 1948) Between Pacific Tides (1939). I am sure I read a later edition than this. Currently, I own a copy of the 5th edition, from 1985.

The third work that inspired me was Ralph Buchsbaum’s (1907 – 2002) Animals Without Backbones (1938). Originally, I owned a Pelican edition in two volumes from 1957. My current reference copy is from 1987, in a single volume.

I acquired a compound microscope in 1962, for my 14th birthday. As explained in Optics 6, I used this with a camera to take photomicrographs of plankton and other organisms. In high school, use of a compound microscope was part of the biology course. When I studied biology at college, we were required to use hand lenses, stereo microscopes and compound microscopes, in that order.

An Opticron glass hand lens = inspection magnifier, with chrome plated metal protective cover, 10 × magnification, with a 23 mm diameter glass lens. There is space for the attachment of a thin lanyard. Photo: Opticron.

The college biology program I undertook emphasized field trips to various ecological areas, and in all seasons. In addition to other equipment such as skis, we were required to have a pocket lens = inspection magnifier with us that could magnify up to 10 ×, for field trips. The lens can be made of glass or plastic, with plastic being cheaper and lighter but of lower optical quality and more difficult to clean. I prefer glass lenses.

An element is an individual piece of glass within a lens. Hand lenses can be constructed with one (singlet), two (doublet) or three (triplet) lens elements. Each element is specially shaped to correct for a particular type of optical distortion so the more elements, the higher quality the image. A singlet is gudenuf.

A 10 × magnification hand lens is adequate for most purposes. Higher magnification lenses are harder to use. Large diameter lenses provide a wider field of view = they are easier to use, but are more expensive. I have always used a hand lens 20 – 30 mm in diameter. The description of a hand lens often follows the same convention as binoculars, with lens magnification followed by the lens diameter in mm. Sometimes, the order is reversed.

Some people comment that the use of a hand lens can be challenging. People often start by holding the lens close to their eye, then a) move the subject closer to the eye (if the object is moveable) or b) move their head, with the hand lens, closer to the subject until it comes into focus (if the object isn’t moveable).

Some lenses come with multiple LED-lights for effective illumination in weak light. Some are further optimized with UV lights, so that it is possible to see fluorescence, not just in plants, animals and minerals, but also postage stamps and paper currency.

A lanyard should be fitted to the hand lens. I use braided mason’s twine. It is durable and comes in a variety of bright colours. Mine comes packaged with three 80 m bundles, in red, blue and yellow. Braiding adds strength and reduces tangling. Bonded twine means that it is coated to make it stiffer and more resistant to wear and moisture. Blue is a useful colour, since it is not often found in nature. The lanyard should be put around the person’s neck, so that the lens won’t be lost.

Stereo Microscope

When we returned to our classroom from a field trip, we would re-examine specimens using a stereo microscope. My current Dutch BMS stereo microscope has 10 × eyepieces, with a 2 × and 4 × combined objective, giving a magnification of either 20 × or 40 ×.

The Galilean optical system used here is an arrangement of fixed-focus convex lenses that provides a fixed magnification, with the crucial distinction that the same optical components at the same spacing will, if physically inverted, give a different, though still fixed, magnification. This allows one set of lenses to provide two different magnifications. Yes, it is important to know which magnification one is using, so that sizes seen in the microscope can be converted into real world sizes.
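To make that conversion concrete, here is a minimal sketch in Python using the figures above; the reticle calibration value is an invented example, since in practice an eyepiece reticle must be calibrated against a stage micrometer for each objective setting.

```python
# Magnification arithmetic for a stereo microscope with 10x eyepieces and a
# reversible 2x/4x Galilean objective, plus an example of converting a
# reticle reading into a real-world size. The 50 µm/division calibration
# is an assumed value, not a specification of the BMS microscope.
EYEPIECE = 10
OBJECTIVES = (2, 4)

for objective in OBJECTIVES:
    print(f"{EYEPIECE}x eyepiece with {objective}x objective -> {EYEPIECE * objective}x total")

UM_PER_DIVISION_AT_20X = 50     # assumed calibration at 20x
reading_divisions = 7           # divisions spanned by the specimen
size_um = reading_divisions * UM_PER_DIVISION_AT_20X
print(f"Specimen width at 20x: about {size_um} µm")
```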

In my student days, stereo microscopes were commonly used as an aid for dissection. The major problem with this microscope is that it is not designed to take photomicrographs.

A BMS stereo microscope, with 10 x eyepieces, and 2 x and 4 x objectives allowing 20 x and 40 x magnification, suitable for dissecting biological samples.

Modern stereo microscopes are often coated with anti-bacterial paint. It is thanks to Charles Wheatstone (1802 – 1875) that stereo microscopes have two eyepieces. He discovered stereopsis = the fundamentals of depth perception, in 1840, allowing people to see objects in three dimensions. Eyepieces are typically inclined 45º, with at least one tube being adjustable ±5 diopter, so that users can adjust the microscope to their eyes once. There should be no need to use eyeglasses while viewing through a microscope. There is a focussing knob, sometimes two, for coarse and fine adjustments.

Illumination in a stereo microscope is most often incident = reflected from the surface of an object, rather than diascopic = transmitted through an object. In the new millennium, LED lights have been used for both incident and transmitted illumination. LED lights have a life of about 50 k hours. Usually, they work from mains electricity, 230 or 110 V, through a transformer. Many stereo microscopes include a rechargeable battery, so they can work offline for up to about ten hours, with three hour charging times. This means stereo microscopes can now be taken out into the field, something that was not possible in my student years. Today it is common for Double Pole Double Throw (DPDT) switches to offer full separation from the mains power. The claim is safety, but I think part of the reason is related to prevention of battery drain. Power cords are typically detachable, and can be fitted with a C13 connector to the microscope and an appropriate power socket connector. For the model shown in the photograph, length x width x height = 150 x 133 x 358 mm, mass = 2.3 kg.

In addition to its uses in biology, I also use my stereo microscope to inspect soldering and to examine electronic circuits.

Compound Microscopes

The compound microscope shown above is a MAGUS Bio 240T, a biological microscope mainly used for education. It is designed for work with transparent and translucent biological specimens using brightfield illumination. The microscope has a revolving nosepiece suitable for four plan achromatic objectives. The 3 W LED illuminator has an intelligent control system to remember and maintain a different light intensity for each objective. There is an LCD screen, which displays operating parameters. The operational life of the illuminator is 50 k hours = 30 work years = almost a lifetime of work! Colour temperature can be changed within the range of 3000 to 7000 Kelvin.

Magnification is from 40 – 1000 ×. The trinocular head has a vertical tube for mounting a digital video camera. There are two eyepiece tubes, both equipped with diopter adjustment rings. The head revolves and can make a full rotation. There are 2 standard 10 ×/20 mm eyepieces, fitted with rubber eyecups.
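The two numbers 10 ×/20 mm are the eyepiece magnification and its field number. A minimal sketch in Python, using the four objectives listed later in the specification (4 ×, 10 ×, 40 × and 100 ×), shows how they give the quoted 40 – 1000 × range and how the field number translates into the diameter of the patch of specimen actually visible.

```python
# Magnification and field-of-view arithmetic for a compound microscope with
# 10x/20mm eyepieces. The objective list is taken from the specification;
# the formulas are the standard ones (no extra tube-lens factor assumed).
EYEPIECE_MAG = 10
FIELD_NUMBER_MM = 20                 # diameter of the view at the eyepiece
OBJECTIVES = (4, 10, 40, 100)

for obj in OBJECTIVES:
    total_mag = EYEPIECE_MAG * obj
    field_of_view_mm = FIELD_NUMBER_MM / obj      # visible specimen diameter
    print(f"{obj:>3}x objective: {total_mag:>4}x total, "
          f"field of view ≈ {field_of_view_mm:.2f} mm")
```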

The microscope is equipped with coarse (left side) and fine (right side) focusing knobs. The coarse focus lock knob is located on the left side. It is used for adjustment after switching objectives.

The specimen stage is equipped with a belt drive mechanism that gently moves the specimen along the stage. The specimen holder is mounted with two screws. If necessary, it can be removed.

An Abbe condenser is centered along the optical axis. It is fixed, preventing the condenser from being moved accidentally, an important consideration in an educational microscope. The condenser has a numerical aperture of 1.25. Markings indicate objective magnifications. The iris diaphragm is adjusted by a knob. This can be used to increase image contrast, improving specimen visibility.

Selected settings are displayed on the LCD screen. Using a pair of knobs, you can adjust the light, set the sleep mode, or set the auto-off time.

After installing the microscope on the work area, the power supply and power cord can be hidden. The side openings in the stand act as handles, allowing the microscope to be carried or moved around.

Infinity plan achromatic objectives: 4×/0.10; 10×/0.25; 40×/0.65 (spring-loaded); 100×/1.25 oil (spring-loaded). Eyepieces: 2 each 10×/20 mm with long eye relief; 2 each eyepiece eyecups. Also included: C-mount camera adapter; light filter; bottle of immersion oil; power adapter = transformer, and power cord; dust cover; user manual. Additional equipment: digital camera; calibration slide; monitor. Size (L x W x H) 450 x 300 x 650 mm; mass = 9.8 kg.

Digital Microscopes

Is a digital microscope a real microscope? This is a question I regularly ask myself, with the prepared answer being no. Unfortunately, for my prejudiced view of the world, digital microscopes are becoming better. Yet, my mindset, formed in the 1950s, still has difficulties accepting these as anything more than a slightly different digital camera, inferior to an optical microscope. The good news is that my world view is slowly changing. I have avoided purchasing a new compound microscope, in the hope that a digital microscope will make that investment unnecessary.

Part of the reason for my initial skepticism was that the first digital microscope I met was being sold as a toy. It was low-powered, with plastic lenses, and used a USB interface to connect to a computer monitor. It was incapable of working like a real compound microscope. A real digital microscope has to be more than a webcam attached to a macro lens.

An acceptable digital microscope should be computer controlled and automated, allowing advanced image analysis. At a minimum, it should be able to find/ calculate distance and area measurements. In a Wikipedia article about this topic, quantitation of fluorescent or histological stains is mentioned.

What was at one time called USB 3.2, released in 2017, uses two lanes of 10 Gbps simultaneously, giving a maximum transfer rate of 20 Gbps over a USB-C connector. This is now called USB 3.2 Gen 2×2 = SuperSpeed USB 20 Gbps = USB 20 Gbps. Whatever its name, this allows a video resolution of up to 4K = 4096 × 2160 pixels, at a frequency of 60 Hz. Variable illumination should be provided with an LED source close to the camera lens. Modern, modestly priced digital microscopes offer magnification from about 20 × to 400 ×, sometimes more.
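To see why 20 Gbps is enough headroom for that video format, here is a minimal sketch in Python of the bandwidth arithmetic, assuming uncompressed 24-bit colour and ignoring protocol overhead.

```python
# A rough estimate of the raw data rate of 4K video at 60 Hz, assuming
# uncompressed 24-bit colour; protocol overhead is ignored.
WIDTH, HEIGHT = 4096, 2160
FPS = 60
BITS_PER_PIXEL = 24          # 8 bits each for red, green and blue

raw_gbps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL / 1e9
print(f"Raw video stream: {raw_gbps:.1f} Gbps")   # ~12.7 Gbps, under the 20 Gbps limit
```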

The real challenge with digital microscopes is obtaining software with device support. μManager provides professional microscopy software able to connect to a large number of devices. Some types of hardware are automatically supported, others not so much. μManager’s open device interface lets anyone write code to control microscope-related equipment, resulting in a large and growing list of supported equipment. A scripting interface makes it possible to accomplish tasks that cannot be executed within a graphical user interface (GUI).
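As an idea of what such scripting looks like, here is a minimal sketch in Python, assuming the pycromanager bridge to μManager is installed and a Micro-Manager instance with a configured camera is running; the method names are the usual MMCore calls as exposed in snake_case by the bridge, so treat the details as an assumption rather than a recipe.

```python
# A sketch of scripted image capture through μManager's Python bridge,
# pycromanager (pip install pycromanager). It assumes a running Micro-Manager
# instance with a configured camera; the snake_case method names below mirror
# the MMCore API and should be checked against the installed version.
from pycromanager import Core
import numpy as np

core = Core()                    # connect to the running Micro-Manager instance
core.set_exposure(50)            # exposure time, in milliseconds
core.snap_image()                # acquire a single frame
pixels = core.get_image()        # flat pixel buffer
width = core.get_image_width()
height = core.get_image_height()
image = np.reshape(pixels, (height, width))
print(f"Captured a {width} x {height} frame, mean intensity {image.mean():.1f}")
```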

The advantage of a digital microscope should not just be its low cost, and eyepiece elimination. It should also eliminate the mess and work of staining and preparing slides. With the use of sensitive photon-counting digital cameras, digital microscopy should be able to avoid damaging vulnerable biological samples.

The more advanced digital microscope units have stands that hold the microscope and allow it to be racked up and down, similarly to standard optical microscopes. Calibrated movement in all three dimensions is available through the use of a step motor and automated stage. The resolution, image quality and dynamic range vary with price. Systems with a lower number of pixels have a higher frame rate (30 fps to 100 fps) and faster processing. The faster processing can be seen when using functions like HDR (high dynamic range). In addition to general-purpose microscopes, instruments specialized for specific applications are produced. These units can have a magnification range of 0 – 10 000 ×, and are either all-in-one systems (computer built-in) or connect to a desktop computer. They also differ from the cheaper USB microscopes not only in the quality of the image, but also in capability, and in the quality of the system’s construction, giving these types of systems a longer lifetime.

Conclusion. When people think of optical devices for biology, they think that they will get the most value for their money by acquiring a compound microscope. For me, the most value comes from a hand lens. After this, a stereo microscope is the most practical. It is only if one is interested in cellular biology that a compound microscope is needed. In my old age, I have lost interest in preparing slides so that I can view samples with a compound microscope. Appropriate preparation takes time. Samples have to be cut very thinly. Most often they have to be stained, so that contrasting parts of a cell can be seen. When pressed for advice, I say: buy a good quality hand lens. Use it for a year before even considering anything else. Except for people with an interest in cellular biology, that something else should be a stereo microscope. Use it for at least a year before investing in a compound microscope.

This post was originally written on Friday 2024-03-22 as Optics 2. It was saved, for the first time, at 18:10. On 2024-04-09 at 20:06 it was scheduled to be published on 2024-06-29 as Optics 6. Later, on 2024-04-25, it was rescheduled for 2024-07-06 at 12:00. On 2024-04-27, it was reconstituted as Optics 9, and rescheduled for 2025-01-25.

On 2025-03-01, Optics 10 will be published. It is about digital cameras. On 2025-03-08, Optics 11 will be published. It is about photographic collections.

Optics 8

This is a Celestron Nexstar 8SE, often described as a good telescope for beginners/ hobbyists. It is the biggest telescope in Celestron’s iconic orange tube family. Its optics are acceptable, and it gathers lots of light. It is even affordable, at about US$ 1 000. Despite this, I am unconvinced this is a telescope for me. There is too much automation, hiding its operations. I want to know what is happening. Photo: Celestron.

This weblog post is the eighth of a series about optics and optical equipment. Optics 8 is about astronomical telescopes. Optics 9 is about microscopes. Later in 2025, two additional posts will appear: #10 is about digital cameras; #11 is about digital photograph collections.

Scandinavian winters are long and dark. Every year, in January, I go through a period where I consider acquiring a telescope. So far, I have not done so. In large part, it has to do with the reality of cloud cover. Norway is one of the most extreme places in the world for cloud cover. That said, Trøndelag is better than most places in Norway.

There has only been one type of telescope I have ever considered buying: a Celestron 8.

Before discussing it, and other telescopes suitable for amateurs, there is the history of astronomy to endure.

Johannes Kepler (1571–1630) investigated some of the laws of optics in his lunar essay (1600). In 1603, Kepler focused on optical theory, published as Astronomiae Pars Optica = The Optical Part of Astronomy (1604). It described the inverse-square law governing the intensity of light, reflection by flat and curved mirrors, principles of pinhole cameras, the astronomical implications of optics such as parallax and the apparent sizes of heavenly bodies. This work provides a foundation for modern optics, despite the absence of anything about refraction.

In physics, refraction is the change in direction of a wave passing from one medium to another or from a gradual change in the medium. Refraction of light is the most commonly observed phenomenon, but other waves such as sound waves and water waves also experience refraction.

Willebrord Snellius (1580–1626) developed the mathematical law of refraction = Snell’s law, in 1621. René Descartes (1596–1650) used geometric construction and the law of refraction = Descartes’ law, to show that the angular radius of a rainbow is 42°. He set out this law in Dioptrique (1637, French) = Optics = Dioptrics (both English), a short treatise that was the first publication to state it.

Christiaan Huygens (1629–1695) wrote several works about optics. These included Opera reliqua and Traité de la Lumière (1690) which presents a wave theory of light. This theory was initially rejected in favour of Newton’s corpuscular theory of light, until Augustin-Jean Fresnel (1788 – 1827) adapted Huygens’s principle to give a complete explanation of the rectilinear propagation and diffraction effects of light in 1821. This principle is now known as the Huygens–Fresnel principle.

Isaac Newton (1643–1727) investigated the refraction of light, demonstrating that a prism could decompose white light into a spectrum of colours, and that a lens and a second prism could recompose the multicoloured spectrum into white light. He also showed that the coloured light does not change its properties by separating out a coloured beam and shining it on various objects. Newton noted that regardless of whether it was reflected or scattered or transmitted, it stayed the same colour. Thus, he observed that colour is the result of objects interacting with already-coloured light rather than objects generating the colour themselves. This is known as Newton’s theory of colour. From this work he concluded that any refracting telescope would suffer from the dispersion of light into colours, and invented a reflecting telescope (today known as a Newtonian telescope) to bypass that problem. By grinding his own mirrors, using Newton’s rings to judge the quality of the optics for his telescopes, he was able to produce a superior instrument to the refracting telescope, due primarily to the wider diameter of the mirror.

In 1671 the Royal Society asked for a demonstration of his reflecting telescope. Their interest encouraged him to publish his notes On Colour, which he later expanded into Opticks. Newton argued that light is composed of particles or corpuscles, which were refracted by accelerating toward the denser medium, but he had to associate them with waves to explain the diffraction of light (Opticks Bk. II, Props. XII-L). Later physicists instead favoured a purely wavelike explanation of light to account for diffraction. Today’s quantum mechanics, photons and the idea of wave-particle duality bear only a minor resemblance to Newton’s understanding of light.

In his Hypothesis of Light (1675), Newton posited the existence of the ether to transmit forces between particles. In 1704, Newton published Opticks, in which he expounded his corpuscular theory of light. He considered light to be made up of extremely subtle corpuscles, that ordinary matter was made of grosser corpuscles and speculated that through a kind of alchemical transmutation “Are not gross Bodies and Light convertible into one another, …and may not Bodies receive much of their Activity from the Particles of Light which enter their Composition?”

A Telescope

Selecting the right equipment is vital in amateur astronomy. Optical quality is the most important characteristic of any telescope because it contributes significantly to the clarity of celestial bodies.

The three most commonly used types of telescopes:

1. Refractors: These use glass lenses at the front of the tube, a design often favoured for planetary observations.

2. Reflectors: These use mirrors instead of lenses. These are preferred for observing deep sky objects such as faint galaxies and nebulae.

3. Compound/ catadioptric: These combine lenses and mirrors. They are versatile, and can be used for viewing a variety of celestial objects.

The Tracking Platform

A telescope tracking platform can be built to allow a telescope to move in sync with the rotation of the Earth: to track stars, planets and other celestial objects as they move across the sky. While mechanical components once provided the timing, computers increasingly do so today. Part of my interest in tracking involves the Forth programming language.

Everything in the night sky is in constant motion. This is mainly due to Earth’s rotation. Without a tracking system, a telescope, regardless of its quality, will only provide fleeting views. Tracking systems help a telescope to consistently follow celestial bodies, by moving the telescope at a similar rate and direction, mimicking the sky’s movement. This is especially important for astrophotography, which can require long exposure times.
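The underlying arithmetic is simple, whether it is eventually coded in Forth or anything else. Here is a minimal sketch in Python; the stepper motor, microstepping and gear-ratio figures are hypothetical examples, not recommendations.

```python
# Sidereal tracking arithmetic: how fast the sky drifts, and how fast a
# hypothetical drive train must step to keep up. The drive-train figures
# (200 step/rev motor, 16x microstepping, 120:1 worm gear) are assumptions.
SIDEREAL_DAY_S = 86_164.1      # one Earth rotation relative to the stars, in seconds

arcsec_per_s = 360.0 * 3600 / SIDEREAL_DAY_S
print(f"Sky drift: {arcsec_per_s:.2f} arcseconds per second")   # ~15.04

STEPS_PER_REV = 200
MICROSTEPS = 16
GEAR_RATIO = 120

steps_per_axis_rev = STEPS_PER_REV * MICROSTEPS * GEAR_RATIO
print(f"Motor step rate: {steps_per_axis_rev / SIDEREAL_DAY_S:.2f} steps per second")
```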

Building a telescope tracking platform requires diligence, patience and persistence. Platforms can be complex and require fine-tuning and troubleshooting. Telescope tracking platforms come in two basic types:

Alt-Az Mounts: They move up-and-down (altitude) and side-to-side (azimuth). These mounts are often easy to use, making them suitable for beginners, for general observation and short-term tracking. They are not suitable for long-term tracking or astrophotography.

Equatorial Mounts: These mounts rotate along an axis parallel to Earth’s rotation. They require alignment with Polaris = the North Star. Equatorial mounts offer superior tracking for longer periods and are needed for astrophotography.

A mount alone doesn’t make a perfect tracking system. Location, local climate, the telescope’s size and weight have to be taken into consideration, along with motor drives and functionality.

Designing a telescope tracking platform is dependent on location, which affects the effectiveness of a tracking system. A polar alignment in the northern hemisphere will differ from one in the southern hemisphere, and the altitude of the celestial pole above the horizon equals the observer’s latitude.

Humidity and temperature influence the telescope’s operation. In extreme cold, the lubricants in the mounts may thicken, hampering smooth movements. High humidity can lead to rusting of the metal parts.

The size and weight of a telescope also affect mount choices. Heavier telescopes require sturdier mounts to support their weight.

Maximum weight capacity varies with the mount. A typical amateur alt-azimuth mount can carry a mass of 20 kg, while an equatorial mount can carry 50 kg. While a casual stargazer can co-exist with an alt-azimuth mount for general observation, someone actively engaged in astrophotography will need an equatorial mount, because of its superior tracking capabilities.

While tracking mount components are widely available online, it is sensible to acquire all the necessary materials before starting to build a telescope tracking platform. This prevents surprises and unnecessary delays. The functionality of the platform directly depends on component quality.

Some crucial components include the mount. Metal bars are needed for the base and rocker. Here, durability is important, so consider stainless steel or aluminum. Quality ball bearings used in the pivot point will ensure smooth rotation of the platform. The motor and gears should be selected to enable auto-tracking, with an adjustable motor speed capable of matching the Earth’s rotation.

Minimum tool requirements include a screwdriver set, drill, hacksaw, wrenches and a level.

Before building the telescope tracking platform, create a detailed design on paper or using design software. This design should show the mount, metal bars, ball bearings, motor and gears.

Start assembling the components beginning with the mount. Because it forms the core support system for the tracking platform, it must be sturdy and well-balanced. Affix the metal bars to provide a frame that can withstand your telescope’s weight, ensuring better stability and tracking accuracy. Place ball bearings in strategic locations to allow smooth rotation of the platform. Connect gears appropriately to the motor, to ensure their rotation is synchronized with the tracking process.

Integrate a controller into the platform to govern the motor speed and direction. Most modern controllers come with the capability to store star maps and tracking speed presets to make tracking easier and more efficient. Once the platform build is completed, a regular maintenance schedule must be devised, and followed.

Once the telescope tracking platform is assembled, it is crucial to run initial tests to ensure everything functions as expected. Drift alignment aligns the telescope with the Earth’s axis of rotation. Point the telescope at a bright star near the celestial equator and monitor its movement. If everything is correctly assembled and calibrated, the star should stay stationary in your telescope’s field of view. If the star drifts, adjustments will be necessary.

Correct controller calibration is essential to align the telescope with celestial objects. Polar alignment aligns the telescope’s rotational axis with the North Star.

Mechanical issues often involve simple corrections. Ensure the metal bars are firmly in place, the ball bearings move freely, and the gears are properly interlocked. Listen for unusual noises when it is operating. This could indicate a component is not operating as it should.

Power supply issues can cause problems. This applies to both battery and mains power. Ensure the supplied voltage matches the motor’s requirement, and there are no power fluctuations.

The Celestron Nexstar 8SE (shown in the photograph at the beginning of this post) is one often suggested for amateurs. It has a fully automated GoTo mount, allowing users to select an object they want to observe from a database of 40 000+ objects. At the push of a button, the telescope will automatically point to it. The telescope is compact and portable. With its Schmidt-Cassegrain optics it has a 20 cm aperture with good light-gathering power, allowing views of the moon, other planets and deep-sky objects.

The telescope comes with a built-in wedge for polar alignment. A camera adaptor will allow the connection of a mirrorless or digital single lens reflex (DSLR) camera, for astrophotography. Other accessories are available that extend capabilities.

Tom Johnson (1923 – 2012) was an American electronics engineer and astronomer who founded Celestron, a company which revolutionized amateur astronomy. He served as a military radar technician during World War II. In 1955, he started Valor Electronics, which produced electronics for military and industrial use, in Gardena, near Los Angeles, in California.

Celestron was created as the Astro-Optical division of Valor in 1960. Johnson had been looking for a telescope which could be used by his two sons, but found no child-friendly models on the market at the time. While building a 6-inch reflector telescope in 1960, Johnson encountered a lens-grinding kit. After several days of hand grinding, he invented a machine that would grind the lens for him.

Soon, the company was attempting to build various models of Schmidt–Cassegrain telescopes. However, these proved difficult to mass-produce because they needed Schmidt corrector plates, a hard to manufacture aspheric lens. To solve this production problem, company engineers developed a new telescope, the Celestron 8, in 1970, which was compact, affordable and easy to manufacture.

Meanwhile, further north in Watsonville, California, Meade Instruments, founded by John Diebel (1943 – ) in 1972, started selling Japanese telescopes. It became the world’s largest manufacturer of telescopes, starting in 1976. Unfortunately, they used litigation as a means of preventing competition. They ceased operation in 2024, after losing some important lawsuits.

The largest telescope brand in the world is now Sky-Watcher, established in 1999 by the Synta Technology Corporation of Taiwan. It markets telescopes and astronomy equipment, such as mounts and eyepieces, aimed at the amateur astronomy market. The products are manufactured in Suzhou, China. The brand is primarily distributed in North America and Europe.

Buying is usually the easy and inexpensive way of acquiring an astronomical telescope. However, some people are more inclined to make rather than to buy. Amateur telescope makers (ATMs) build telescopes as a hobby, for personal enjoyment of a technical challenge. They will claim, often to a spouse, that they are saving money, but this is seldom more than an excuse. Sometimes it is done to provide custom features on a telescope, or for research purposes.

John Lowry Dobson (1915 – 2014) was an American amateur astronomer, best known for the Dobsonian (light bucket) telescope, a portable, low-cost Newtonian reflector telescope. He promoted awareness of astronomy through public lectures and sidewalk astronomy performances. His Dobsonian telescope is an alt-azimuth mounted Newtonian telescope design popularized in 1965. It vastly increased the size of telescopes available to amateur astronomers. Features included a simple, easy to manufacture mechanical design, using easily available components to create a large, portable, low-cost telescope. The design is optimized for observing faint deep-sky objects such as nebulae and galaxies, with a large objective diameter and short focal length. Their portability allowed travel to less light-polluted locations.

At some future date, I intend to construct a small astronomical observatory, with its own telescope. The major problem with Vangshylla is its lack of clear skies. The Climatic Research Unit of the University of East Anglia calculated cloud cover between 1991–2020 in 196 sovereign countries and Greenland. Norway had 81.5% cloud cover. Algeria has the least at 21.0%. There are only two countries with more cloud cover than Norway: São Tomé and Príncipe with 83.5%, while Greenland has 83.7%. Finland, the United Kingdom and Sweden are close with 79.6, 78.4 and 78.2%, respectively.

The following applies to Steinkjer, ca. 35 km north east of Vangshylla. In Steinkjer, the average percentage of the sky covered by clouds experiences significant seasonal variation over the course of the year. The clearer part of the year in Steinkjer begins around 04-07 and lasts for 5.3 months, ending around 09-17. The clearest month of the year is May, during which on average the sky is clear, mostly clear, or partly cloudy 45% of the time. The cloudier part of the year begins around 09-17 and lasts for 6.7 months, ending around 04-07. The cloudiest month of the year is January, during which on average the sky is overcast or mostly cloudy 74% of the time.

Of Canada’s sunniest places = least cloud cover, ranks 9 to 18 (with one exception, #13) are cities in British Columbia. These are: #9 = Kelowna (where my mother grew up), #10 = Kamloops (where Trish’s sister and other relatives live), #11 = Penticton, #12 = Vernon, #14 = Prince George, #15 = Abbotsford, #16 = Nanaimo (where my father grew up), #17 = Chilliwack and #18 = Victoria.

In my childhood, I visited the Dominion Astrophysical Observatory, located on Observatory Hill in Saanich, British Columbia, near Victoria. It was close to the experimental farm, also in Saanich, where my uncle was director. The main instrument is the 1.83 m Plaskett telescope. In its day, the observatory was a world-renowned facility where many discoveries about the Milky Way were made. It was one of the world’s main astrophysical research centres until the 1960s.

The Plaskett telescope was planned to be the largest telescope in the world, but delays caused by World War I, and production errors requiring two regrindings of its mirror, meant it was completed on 1918-05-06, six months after the 2.54 m Hooker telescope at Mount Wilson Observatory, in Los Angeles, California.

Note: This post was cloned as Optics 5 from Optics 4 on 2024-03-23. It was saved for the first time at 8:00. On 2024-04-09 at 20:09 it was scheduled to be published 2024-06-22 at 12:00. On 2024-04-27 it was reconstituted as Optics 8 and rescheduled to be published 2025-01-18 at 12:00. Later that date was changed to 2025-01-25.

Optics 7

Originally, this photo was supposed to be of Trish’s maternal grandmother’s opera glasses. Immediately before publication, they could not be found for a photo shoot, so this pair, which appears in Wikipedia, was substituted: early 20th century mother of pearl opera glasses and leather case. The glasses are marked with the name of the vendor, “Ryrie Bros. Toronto”. Photographer: Sobebunny, 2009-12-26.

This weblog post is the seventh of a series about optics and optical equipment. This post is about binoculars, but also includes opera glasses, monoculars and spotting scopes. Future posts: #8 is about astronomical telescopes, and #9 is about microscopes. Later in 2025, two additional posts will appear: #10 is about digital cameras; #11 is about digital photograph collections.

Binocular specification values include information about strength (magnification power) and size (objective lens diameter). These are typically designated with two numbers, such as 8×40, where 8 is the magnification power and 40 is the diameter (in millimeters) of the objective lenses = the lenses closest to the object being viewed. Objective lens size tells: 1) how physically big the binoculars are, and 2) how much light they can gather. These numbers fail to provide information about the quality of the optics or other features such as: rubber protective covering, waterproofing and fog-proofing or, often importantly, type of prism and type of glass.

Binoculars provide users with a three-dimensional image because each eyepiece presents a slightly different image to each eye and this parallax = displacement = difference in the apparent position, allows a viewer’s visual cortex to generate an impression of depth. Monoculars are unable to achieve this.

Most early binoculars used Galilean optics = a convex objective and a concave eyepiece lens. This presented an erect image but with a narrow field of view and low magnification. The construction is still used in cheap models and in opera glasses. Aprismatic binoculars with Keplerian optics = twin telescopes, provide each tube with relay lenses to erect the image. A relay lens is a lens or lens group that inverts an image and extends the optical tube; relay lenses are found in refracting telescopes, endoscopes and periscopes to extend system length. This re-inversion happens before the eyepieces, which would otherwise present an upside-down image.

An endoscope = an inspection instrument composed of image sensor, optical lens, light source and mechanical device, which is used to look deep into the body by way of openings such as the mouth or anus. A periscope = an instrument for observation over, around or through an object, obstacle or condition that prevents direct line-of-sight observation from an observer’s current position. In its simplest form, it consists of an outer case with mirrors at each end set parallel to each other at a 45° angle.

In optics, an erect image is one that appears right-side up. An image is formed when rays from a point on the original object meet again after passing through an optical system. The opposite of an erect image is an inverted image.

When lenses are part of a computing device, there is no need to erect the image optically. The transformation can be done efficiently by a graphics processor inside the device, before the image is shown on a screen.

Classification of binoculars and related optical instruments

Indoor vs outdoor: Opera glasses are designed to bring clarity to indoor/ theatrical experiences, while most of the other instruments, including binoculars, are designed to bring clarity to outdoor experiences.

Size of objective lenses: Binoculars are categorized into compact, midsize and full-size models, based on the size of their objective lenses. Variations in optics, design and construction can mean that models with the same size objective lenses will differ in bulk and weight. Some binoculars are bulky so that they will float.

Bulk: Lightweight compact models make sense for hiking. Midsize models, with larger objectives, are bulkier, but provide brighter images and can be more comfortable to hold for long periods. Full-size binoculars can be useful in low-light conditions, but are heavier and less comfortable to use without a tripod.

In comparing lenses it is often useful to compare the area of the objective lens, not its diameter. Thus, area = π · r², where π ≈ 3.14 and r = 20 mm for a 40 mm diameter lens. This gives an area of about 1 256 mm². With r = 40 mm for an 80 mm diameter lens, the area is about 5 024 mm², which is 4 times as much.
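To make that comparison concrete, here is a minimal Python sketch of my own (not from any binocular vendor) that computes the light-gathering area of an objective lens, and also the exit pupil = objective diameter ÷ magnification that appears in the specification tables later in this post:

```python
import math

def objective_area_mm2(diameter_mm: float) -> float:
    """Light-gathering area of an objective lens, in mm²."""
    radius = diameter_mm / 2
    return math.pi * radius ** 2

def exit_pupil_mm(diameter_mm: float, magnification: float) -> float:
    """Exit pupil = objective diameter divided by magnification."""
    return diameter_mm / magnification

# Compare a 40 mm and an 80 mm objective, as in the paragraph above.
a40 = objective_area_mm2(40)   # ~1 257 mm²
a80 = objective_area_mm2(80)   # ~5 027 mm²
print(f"40 mm: {a40:.0f} mm², 80 mm: {a80:.0f} mm², ratio: {a80 / a40:.1f}×")

# Exit pupil for the 10×42 binoculars described later in this post.
print(f"10×42 exit pupil: {exit_pupil_mm(42, 10):.1f} mm")  # 4.2 mm
```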

In an article about binoculars for old people, it was suggested that consideration should be given to three features that younger people may be able to ignore. First, use lightweight binoculars, substituting a roof prism for a Porro prism; second, to maximize eye relief, buy binoculars with an increased distance between the eye and the eyepiece at which the full image can still be seen; third, decrease magnification from 10× to 8× or even 6×.

Prisms

Isaac Newton (1642 – 1727) used a prism to disperse white light into its component parts. Dispersion is of limited interest when it comes to using prisms in the real world. There is no perfect prism, but two types have become standard in binoculars. The Porro prism is named after the Italian Ignazio Porro (1801 – 1875), who invented it ca. 1850. The roof prism is a later invention.

Since the 1960s, handheld binoculars with roof prism-based reversing systems have become increasingly popular. Yet, these prisms are also problematic. Their use has led to a loss of image resolution, which turned out to be the consequence of unwanted interference: the total reflection occurring in the prisms causes a partial polarization of the beam. This beam is then split at the roof edge, with the two half-beams being reflected in different directions. After all the partial beams have been combined, however, their polarization vectors point in different directions, which corresponds to a phase shift and leads to a loss of resolution via the interference effects mentioned above. This phase shift occurs in perfectly manufactured prisms!

Binoculars with roof prisms are more compact and streamlined, lighter, and much easier to carry around than binoculars with Porro prisms. Roof prisms are more complex because there is no easy horizontal offset; they rely on intricate, precisely machined paths that reflect the light from the objective to the ocular lenses. In 2025, people should choose roof prism binoculars, if they can afford them.

Usage:

Binoculars for backpacking and hiking should be small and lightweight. Suitable models are often referred to as compact binoculars, with a magnification of 8 or 10, and an objective lens diameter of less than about 28 mm. A rubber coating and water resistance or waterproofing will be appreciated.

Binoculars for wildlife viewing also include those used on safaris and for whale watching. Use a higher magnification (10 rather than 8) if one is likely to be located far away from the animals. Midsize (32 mm) is preferred over full-size (42 mm) if one wants something more compact. Water resistance is useful, but waterproof models are preferred for whale watching (or any other watching) from a boat.

Birding is a sub-genre of wildlife watching. Many birders are less concerned about size and weight, preferring midsize or full-size models such as 8×32 or 8×42. A 10-power magnification will have a narrower field of view than an 8-power pair. A wide field of view is useful for locating birds and other wildlife on the move. Water resistance is also a good feature, as is fog-proofing for when binoculars go from a warm vehicle to a cooler outside environment.

Binoculars on boats should have a low magnification (8 or less) because boat movements can make steady viewing challenging. Waterproof models are useful. 8×32 is a popular size.

For stargazing, one should maximize magnification and light gathering capabilities by choosing full-size binoculars: preferably, 10×50. With higher magnification, a tripod will be necessary. Telescopes for astronomy will be discussed in Optics 8.

Opera glasses = theater binoculars = Galilean binoculars, are compact, low-power optical magnification devices, usually used at indoor performances. Magnification power below 5× is usually desired to minimize image shake and maintain a large enough field of view. A magnification of 3× is preferred. The design of many modern opera glasses of the ornamental variety is based on the popular lorgnettes of the 19th century. Often, modern variants are equipped with an LED flashlight, allegedly to help people find their place in the dark.

Binoculars at Cliff Cottage

Our oldest optics are opera glasses previously owned by Trish’s maternal grandmother, made by (or at least labelled by) G. E. Trorey, in Vancouver. The Trorey jewellery company was started in 1893, and is famous for building what later was called the Birks clock, made to celebrate Trorey’s fifth anniversary. Birks bought Trorey in 1906, so these opera glasses date from before then.

In the early 1980s we purchased a pair of Tento BPC 7×50 binoculars, in Molde, Norway. Tento = Technointorg = the Russian Ministry/ Office of Foreign Trade selling Russian optics outside the Soviet Union. БПЦ = BPC = бинокли призменные с центральной фокусировкой = binokli prizmennye s tsentralnoy fokusirovkoy = prism binoculars with central focusing. These were most likely made by Загорский оптико-механический завод = ZOMZ = Zagorsk Optical-Mechanical Plant, in Sergiyev Posad, 70 km northeast of Moscow.

On 2024-04-18, we decided that we needed better optical equipment. Yes, age may have been part of that decision. Since we have an optician in Inderøy, we decided to patronize them and buy a pair of Breitler Ultima 10×42 binoculars. Specifications are provided below:

Brand/ model: Breitler Ultima
Objective diameter (mm): 42
Magnification (×): 10
Field of view at 1000 m (m): 101
Near limit focus (m): 2
Exit pupil: 4.2 mm
Glass: BaK-4
Prism type: roof
Features: Waterproof, nitrogen filled, protective covering
Product type: Binoculars
Diameter front lens (mm): 42
Dimensions (L × W × H, mm): 147 × 129 × 63
Mass (g): 670

Monocular vs Spotting Scope

Monocular scopes share most features with spotting scopes, but in a smaller size and with less power and fewer capabilities. For instance: Head: provides support for the eye piece and connects it to the objective lens. Nose piece: a rotatable component that holds and selects the active objective lens. Eye piece = ocular lens = the lens nearest to the eye, which you look through to observe objects. Objective lens: usually 2 or 3 lenses, with a prism in the middle that folds the optical path and extends the entire optical system length for an erect image. Arm: supports the head and connects it to the base. Focusing ring: located around the body in some monocular scopes; some use a focusing lever or slider button instead. Zoom is optional on monocular scopes. Those that have it offer a magnification that varies from 4× to 12×, with 8× considered standard.

Most spotting scopes allow the user to alternate between fixed length and zoom magnification. Zooming allows users to find objects at low magnification, and then to narrow the field of view and magnify the object to observe the details. A variation from 20× to 60× is common.

Spotting scope characteristics. Eye cup: a small, twist-up shield for the eye. Eye cap: a flexible rubber shield that protects a user’s peripheral vision against light, wind and dust, prevents glare caused by ambient light on the ocular lens, and limits distractions. It is also known as an eye shield. You can extend or contract it to use it with or without glasses. Ocular and objective lens caps: rubber coverings that protect the lenses from water, dirt and impact during transit or in storage. Some of them can be flipped up. Focusing ring: you twist it to the right or left to adjust the focus for a clear picture. Some models have focus knobs instead of focus rings. You will also find dual focus spotting scopes with coarse and fine adjustments.

My formal training as a biology teacher began when I was over fifty. The emphasis was on observation in the field, which included instruction on how to use a spotting scope for the observation of wildlife, particularly birds. Since a spotting scope comes with a large objective lens and high magnification, a foldable tripod is needed to support it and to dampen vibrations for steady viewing. In addition, spotting scopes often have a lens hood = lens shade, to minimize objective lens glare in sunny conditions. These should be retracted in dim environments.

We have a Breitler Pant(h)er 20-60×60 45 degree spotting scope. It was a demo model being sold for less than half price. Since we have an unloved tripod that is no longer being used to hold a camera, it now serves as a support for the spotting scope. Specifications are provided below:

Brand/ model: Breitler Pant(h)er
Objective diameter (mm): 60
Magnification (×): 20-60
Field of view at 1000 m (m): 31.6-16
Near limit focus (m): 6
Exit pupil: 3 – 1 mm
Glass: BaK-4
Prism type: unknown
Features: Waterproof, nitrogen filled, protective covering
Product type: Spotting scope
Diameter front lens (mm): 60
Dimensions (L × W × H, mm): 340 × 90 × 165
Mass (g): 873

Summary

People considering buying a new pair of binoculars may want to consider a spotting scope as an alternative; it is better for some activities, such as viewing wildlife from land. In terms of binoculars, a Porro prism with BK-7 glass may be preferred if expense is an issue. These will be cheaper. In addition, they offer greater clarity and a wider field of view. Most often they will be heavier, and not as durable. Binoculars with roof prisms and BaK-4 glass will be more expensive, but offer more durability, lighter weight, more compact dimensions, superior waterproofing and greater magnification strength; however, they will have less clarity and a narrower field of view.

This post was originally written early in the morning of Monday 2024-03-18, under the title Magnification, which was changed later in the day to Optics 3. It was saved for the first time at 18:10. On 2024-04-09 at 20:08 it was scheduled to be published 2024-06-15 at 12:00. On 2024-06-27 it was changed to Optics 7 and rescheduled to be published on 2025-01-11 at 12:00. Somewhat later it was rescheduled to 2025-01-18. Additional content was added during the week immediately before publication.

Optics 6

A Kodak Petite camera, identical to the first camera I used.

Since the first set of 5 weblog posts about optics was published, I have made some changes to upcoming post content. This weblog post is the sixth in a series. It is about analogue cameras. Future posts: #7 is about binoculars, but also includes opera glasses, monoculars and spotting scopes, scheduled for publishing 2025-01-18; #8 is about astronomical telescopes, scheduled for publishing 2025-01-25; #9 is about microscopes, scheduled for publishing 2025-02-01. Later, two additional posts will appear: #10 is about digital cameras, scheduled for publication on 2025-03-01; #11 is about digital photograph collections, scheduled for publication on 2025-03-08.

As I started preparing to write this post, my mind was comparing analogue photography with riding a horse. Something obsolete. However, I then began to remember that much of my photographic career was dependent on using specific types of cameras for very specific purposes. So yes, this is about analogue = film cameras.

My main objection to film cameras is the expense of film purchase and processing. In contrast, with the camera that comes with almost every handheld digital device = smartphone, the marginal cost of taking an image is almost zero. Admittedly, those images have to be stored somewhere, hopefully in multiple places.

Reminder: The 3-2-1 rule/ data protection strategy = save three copies of data, stored on two different types of media, with one copy kept off-site. This is very expensive to do with analogue photographs, especially if the negatives are missing. Flatbed and slide scanners can create digital copies inexpensively.

In general, a high quality colour image of 2 400 x 4 000 pixels can be reduced in size, without reducing quality, to occupy less than 1 MB. A 10 TB hard disk costs NOK 3 500 or less, and can store 10 million of these images at a cost of about NOK 0.00035/ image. That is about 1/300 of a cent/ image. Yes, one should keep three copies for backup purposes, which raises the cost to about 1/100 of a cent.
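As a sanity check on that arithmetic, here is a small Python sketch of my own; the disk price, image size and the NOK-to-USD exchange rate used for the cent comparison are assumptions taken from, or added to, the paragraph above:

```python
# Rough cost of storing scanned photographs, using the figures above as assumptions.
DISK_CAPACITY_MB = 10_000_000   # a 10 TB disk holds ~10 million 1 MB images
DISK_PRICE_NOK = 3_500
IMAGE_SIZE_MB = 1
NOK_PER_USD = 10.5              # assumed exchange rate, for the "cent" comparison

images_per_disk = DISK_CAPACITY_MB // IMAGE_SIZE_MB
nok_per_image = DISK_PRICE_NOK / images_per_disk          # ~0.00035 NOK
cents_per_image = nok_per_image / NOK_PER_USD * 100       # ~1/300 of a US cent

print(f"{nok_per_image:.5f} NOK per image (~1/{round(1 / cents_per_image)} of a cent)")
print(f"Three copies (3-2-1 rule): {3 * nok_per_image:.5f} NOK per image")
```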

Film

Unprocessed film is a perishable product that can be damaged by high humidity and high temperature. Fresh film is better able to provide true colour. Film should remain unopened in its original canister or plastic wrap. To protect against humidity, include a silica gel desiccant bag in the film storage container. Reuse of desiccant bags is possible.

Yet, during the digital revolution, it was not always possible to buy film, so people attempted to preserve the film they had available. The situation in 2025 is much improved compared to 2005. In much the same way that some people prefer to listen to antiquated LP records played on turntables, some people like to revert to antiquated film technology.

Film that is expected to be used within 6 months can be stored in a refrigerator at 8°C or lower. For longer term storage, years rather than months, a freezer can be used at -18°C or lower. Before use, film stored in a freezer should be placed in a refrigerator for 24 hours. Film removed from a refrigerator should be given 2 hours or more to adjust to room temperature. At one time, even after I had gone over to digital cameras, I stored rolls of 35 mm Fuji slide film in our refrigerator, just in case. Currently, our refrigerator hosts one undated roll of Ilford Pan F film.

Film available in 2025

Here are some notable films being manufactured:

ADOX 20, a black and white film with ISO 20. “No other film is sharper, no other film is more finegrained, no other film resolves more lines per mm (up to 800 L /mm).”

Fuji is one of only two remaining major manufacturers of colour film. Its film range currently comprises consumer films = FujiColor/ FujiColor Superia, and professional films = Neopan, Velvia and Provia. Instax is a range of instant films and cameras launched in 1998, which now outsell the traditional products.

Kodak was established in 1888 and is the other major manufacturer still producing colour film. While the films are manufactured in Rochester, New York, since Kodak’s bankruptcy in 2012 distribution and marketing have been controlled by Kodak Alaris, a UK based company acquired in 2024 by Kingswood Capital Management. The film range is divided into consumer films = ColorPlus & Gold/Ultramax, and professional films = Tri-X, T-MAX, Ektar, Portra & Ektachrome.

Background

Analogue cameras are dependent on using light to create an image on film. Film is used as a generic term, because sometimes a film emulsion is placed on glass plates, or even a piece of metal. At Vangshylla, one of the local farmers has a hobby of using a 4″ x 5″ camera, with glass plates to produce images. I have assisted him once, to take a photo of his family, with him in it.

Personally, I have no intention of reverting to analogue photography, even though I have stored the equipment needed to develop film in our attic. These items are historical objects to show people how humanity has progressed, technically.

Twentieth century photographic techniques

Photography uses a lens to capture a lighted image on a photographic plate in a camera. It is analogous to images passing through the lens of an eye, to create an image on the retina. In much the same way that it can be advantageous to live in a lighted room, sometimes additional light is needed to create a suitable photograph. A common approach is to attach an electronic flash. An important characteristic of photographic film is its light sensitivity, revealed in its ISO number. The actual amount of light hitting a photographic plate is determined by the shutter speed and the aperture opening. Both of these can be adjusted on most advanced modernish analogue cameras.

Shutter speeds are expressed as fractions of a second: a higher number means a faster shutter speed, that is, a shorter period of time that the shutter is open. Typical values are shown in the image below, typically starting with 1/1 000 and ending with 1/2 s. Most films did not produce normal results if the shutter speed was longer than 1/15 s. Speeds of 1 s or more are often referred to as time exposures. Shutter speeds of a longer duration introduce blur in objects that are in motion.

Aperture openings are also stated numerically. The smaller the number, the bigger the opening. Therefore, f/1.4 is a very large opening, while f/22 is a very small opening. A small opening is necessary to provide a large depth of field; large openings have a very limited depth of field.

Some people like to describe exposure in terms of a bucket being filled with water. The aperture is analogous to an adjustable hole at the top of the bucket, and the shutter speed to how long that hole stays open while a continuous stream of water pours in. The smaller the hole, the longer it needs to stay open to fill the bucket; conversely, a larger hole needs a shorter time. In this analogy the water represents light, and a full bucket of water is a properly exposed image.
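For readers who prefer numbers to buckets: photographers usually express this trade-off as an exposure value, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds; settings with the same EV admit the same amount of light. Here is a minimal Python sketch of my own illustrating it:

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """EV = log2(N² / t): the same EV means the same amount of light reaches the film."""
    return math.log2(f_number ** 2 / shutter_s)

# The bucket analogy in numbers: a smaller hole (f/8) left open twice as long
# admits the same light as a larger hole (f/5.6) open half as long.
print(exposure_value(8, 1 / 60))     # ~11.9
print(exposure_value(5.6, 1 / 125))  # ~11.9 (f/5.6 is one stop wider than f/8)
```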

Film speed is the measure of a photographic film’s sensitivity to light. The ISO system (ISO = International Organization for Standardization) was introduced in 1974. It combined the linear ASA scale (ASA = American Standards Association) used in the United States with the logarithmic DIN standard 4512 of the Deutsches Institut für Normung, used in Europe. Almost since its introduction, the DIN component has often been dropped.

The ISO arithmetic scale means that a film with an ASA 200 rating needs only half the light of a film with an ASA 100 rating. However, films with a higher rating produce images with more grain.
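The ASA and DIN scales mentioned above are related by DIN ≈ 10·log10(ASA) + 1, so every doubling of the ASA value adds about 3 DIN. A minimal Python sketch of my own:

```python
import math

def asa_to_din(asa: float) -> float:
    """Logarithmic DIN value corresponding to an arithmetic ASA/ISO speed."""
    return 10 * math.log10(asa) + 1

for asa in (50, 100, 200, 400):
    print(asa, round(asa_to_din(asa)))   # 50 -> 18, 100 -> 21, 200 -> 24, 400 -> 27
```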

Both the ASA and DIN systems have a long history, and many revisions. DIN goes back to at least 1934, but with links to the Scheinergrade system devised by the German astronomer Julius Scheiner (1858 – 1913) in 1894. It uses a logarithmic scale. My direct experience of ASA starts with PH2.5-1960, but versions of it date back to 1943.

Daniel Peter of Fotoblog Hamburg created this free downloadable cheat sheet card for photographers providing a basic overview of aperture, ISO values and shutter speeds.

The camera I remember best from my youth, shown at the beginning of this post, was my mother’s Kodak Petite, made between 1929 and 1934. Since she was born in 1916, I imagine it was purchased towards the end of this time period. It was equipped with a bellows, and used 127 film, which was 46 mm wide. It produced negatives that were 40 x 60 mm. The reason I used this was to take photomicrographs, especially of small marine and fresh water plants that I had collected trailing a plankton net behind my 2.4 m long Sabot sailing dinghy. I used black and white film exclusively. This camera rested on the eyepiece of my microscope, while a 30 second time exposure was made.

More modern cameras suitable for use with modern microscopes will be discussed in Optics 9 Microscopes, scheduled for publication on 2025-02-01.

In grade 12, I was tasked as the official student photographer for my high-school newspaper and yearbook, and I used a 35 mm camera for the first time. It was a Contax II, used with Kodak Tri-X black and white film with a high speed of 400 ASA, purchased in 35 mm x 100 foot lengths. In the school darkroom, I would load film onto cassettes. Because the camera lacked flash possibilities, dark situations often required that I push the film to ASA 1600, and develop it accordingly. Unfortunately, this increased the film’s grain structure. Kodak attempted to market this as adding a level of realism to photographs. I was never convinced.

I developed all of the film I used, then spent numerous hours making prints with an enlarger. Working alone, and in the dark, except for a red light, this was the process I liked the most. In essence an enlarger is a camera with a light at the top, that projects a negative image onto photographic paper. One could set the aperture opening, and the exposure time. In this case there was a large timer that controlled the light inside the enlarger, then counted down the seconds before turning off the light. It was 1960s automation.

I am not sure how many times I visited the local camera shop on 6th Avenue in New Westminster. I was interested in an Asahi Pentax Spotmatic camera. It used a through-the-lens (TTL) centre-weighted light meter. This camera allowed one to focus the lens at maximum aperture with a bright viewfinder image. After focusing, a switch on the side of the lens mount stopped the lens down and switched on the metering, which the camera displayed with a needle located on the side of the viewfinder. This stop-down light metering was innovative, but it limited the light meter, especially in low light situations. An M42 screw-thread lens mount was used to accommodate high quality Takumar lenses. This meant that it took considerably longer to change lenses than with a bayonet mount.

Exakta VXIIb

Unfortunately, I could never afford to buy a Pentax. Some years later, in 1973 or so, I bought my first camera, a used Exakta Varex IIb, called an Exakta VXIIb in the USA. It was a 35 mm camera, produced between 1963 and 1967, and referred to as version 6. Film speeds could be set on the top of the shutter speed dial. Shutter speeds followed the modern geometric progression from 1/30 to 1/1000 second. The rewind knob had a crank handle, but there was no view-finder release knob. The camera did not have a built-in metering system, but I had a hand-held light meter. Despite this limitation, I especially liked the camera for two reasons. First, it came with a bayonet mount system for interchangeable lenses. Second, and more unusually, it allowed interchangeable view-finders. I had two: a pentaprism, as used on most 35 mm cameras, and another that allowed viewing from the top. This was especially useful when I studied archaeology, because it could take photos of an archaeological excavation’s stratigraphy = cultural layers.

Exakta is no longer a recognizable brand, but James Stewart (1908 – 1997)/ L. B. Jefferies and Grace Kelly (1929 – 1982) / Lisa Fremont used one attached to a Kilfitt 400 mm f5.6 lens in Alfred Hitchcock’s (1899 – 1980) Rear Window (1954). It is one of the most iconic cameras in film history.

With this new camera, I switched to Ilford Pan F black and white film with ASA 50 film speed, which, with its fine grain, suited my personality better. I have never been a user of colour negative film, but with this camera I used Ektachrome, a slide/ transparency film developed by Kodak in the early 1940s. It allowed both professionals and amateurs the opportunity to process their own films. I always used the ASA 64 version, because it gave better results than High Speed Ektachrome, announced in 1959 with ASA 160. At the time, many North Americans were Kodachrome enthusiasts. However, it required professional processing.

Ektachrome processing is simpler, and small professional labs could afford the equipment to develop the film. I used the E-6 process variant, which allowed amateurs with a basic film tank and a tempering bath to maintain the temperature at 38°C, to obtain suitable results.

Yashica FX-2

A Yashica FX-2 35 mm camera. Photo: Joe Haupt

After Trish and I married, we bought ourselves a modern Yashica FX-2 35mm single lens reflex (SLR) camera. This type of camera was manufactured in Japan, starting in 1976. It was Yashica’s second camera to use the new bayonet lens mount known alternately as the Contax/Yashica = C/Y mount. The intended advantage was that one could start off with inexpensive Yashica lenses, then progress to better quality Contax lenses when finances allowed it. In reality, one stuck with Yashica lenses because they were more robust than the more delicate Contax lenses.

Its viewfinder provided 0.89× magnification and nearly 90% field of view. The through-the-lens (TTL) light meter used cadmium sulfide (CdS) as the photoconductive material in its photoresistor. The film speed could be set from 12 ASA to 1600 ASA. Not much film is sold over 400 ASA. However, this allowed users to set the speed at which the film would later be developed, an important characteristic for those who push-process their film.

The results of this metering were shown with a needle on the right side of the viewfinder display. When a proper shutter speed and aperture opening combination was selected, the needle sat between the + and −, indicating a proper exposure. Manual focus was provided by turning the lens to the left (closer) or right (further away).

One could always see how a particular aperture opening would affect focus by pressing a depth of field preview button located at the base of the lens. The light meter was powered with a 1.3 V mercury battery (EP-675R, RM-675R, or equivalent) located underneath the camera. Its housing could be opened with a coin. Yes, younger readers may need to understand that smaller units of currency involved small round pieces of metal that were often carried in wallets or pockets. These days it would be considerably easier for me to find a slotted screwdriver than a coin. Yes, technology changes.

It was a very easy camera to use, and conventional for the period. The focal plane shutter operated from 1/1000 to 1 sec, plus B = bulb. If one needed a longer shutter opening than 1 second, one set it to B and held the shutter open for as long as one wanted. To use it properly, the camera would have to be mounted on a tripod, with the exposure made using a cable release. A flash could be synchronized by using X sync at speeds of 1/60 and slower. A self-timer was built into the camera. This gave the photographer about 10 seconds to position her/himself.

The mass of the camera body (without lenses) = 690 g. Camera dimensions were: 144.5 x 94 x 51 mm.

With this camera, my darkroom career almost ended, although I continued to develop black and white film. For colour slides, we increasingly used Fujichrome, usually Fujichrome 64.

Tripods

While I have learned to brace myself to take photographs without tripods, they are useful! Of course, to take full advantage of a tripod, one should use a cable release = threaded cable release = a device used to actuate the shutter of a camera without touching the shutter button. It consists of a flexible wire moving within a sheath, with a threaded connector on one end and a plunger on the other. The sheath is usually vinyl. It is purely mechanical, in contrast to an electronic remote shutter release.

At one time I found a used tripod on sale for less than a reasonable price. I bought it, so that I could give it away to someone with an unmet and often unrealized need. So, my gift suggestion is for people to stock up on unusual, inexpensive used items that can be given away. They make much nicer gifts than yet another box of chocolates, which people will despise you for, because eating those unnecessary calories added an extra 100 g.

This is the last camera that will be discussed here. We bought one additional 35mm SLR camera before entering the digital age. It claimed to be more modern, but my favourite camera will always remain the Yashica FX-2.

In much the same way that I have an aversion to audiophiles, who claim to hear music much better than their anatomy is capable of perceiving, there are terms to describe two types of annoying camera users. In Swedish, the term linslus, with lins = lens & lus = louse, refers to someone obsessed with being photographed. In English, that person could be called a lens louse. A person obsessed with taking photographs is a shutterbug. It is a kinder term, if only because I include myself in that category. These two terms cover annoying people in front of, and behind, the camera, respectively.

Yes, I would like to encourage other people to share their thoughts/ experiences about analogue photography, by making comments or sending an email to me.

On 2024-04-27 this post was scheduled to be published on 2025-01-04 at 12:00. Sometime later that was changed to 2025-01-11 at 18:00.

Brooks Stevens (1911 – 1995)

The photographer George Hunter (1921 – 2013), presumably standing in the background beside the Willys Station Wagon, described this photo as: “1949-07-00 Eileen O’Rourke and Rita Kennedy rest at the base of a sign for the Glynmill Inn, Corner Brook, Newfoundland and Labrador.” The reason for putting this expired-copyright work here is precisely because of the Brooks Stevens designed all-steel wagon, with its fake woody body. It is one of my favourite car designs, and the only American car I have attempted to buy. In the 1960s, it was beyond my meager budget. Alasdair and I stayed at the Glynmill Inn in 2024.

This weblog post is being published on the 30th anniversary of Brooks Stevens’ death on 1995-01-04.

I did not start out admiring Brooks Stevens. Instead, I admired a car, then found out that it had been designed by Stevens. In my opinion, Stevens was more of an industrial designer than a vehicle stylist. He worked with companies that could afford to use complex processes in large, well financed plants, as well as with other companies that often had to outsource processes that larger companies would perform internally. He also had to know how the products he designed would be used. That said, not all of his designs were successful. The worst one was the Jeepster!

Stevens had a very severe case of polio at the age of eight. For a time he was unable to walk without help, but after a regimen of exercise (especially swimming) he regained almost full mobility. He always had a limp and suffered stiffness and pain in one half of his body. Towards the end of his life, more severe symptoms returned and he was eventually forced into a wheelchair. Stevens was encouraged by his father to draw while confined to bed. He studied architecture at Cornell University, at Ithaca, New York, from 1929 to 1933. Then, in 1934, he established a home-furnishings design firm in Milwaukee.

When he started working with vehicles in the mid-1940s, after the end of the Second World War, he was working with unconventional products, including Harley-Davidson motorcycles in Milwaukee, Wisconsin, and Willys-Overland jeeps in Toledo, Ohio. He later worked with Studebaker, in South Bend, Indiana.

Stevens’ role at Willys was subordinate to chief engineer Barney Roos (1888 – 1960). Roos had served as Studebaker’s head of engineering from 1926 to 1936, specializing in straight-eight engines. He later worked for the British Rootes Group on the design of Humber, Hillman and Sunbeam Talbot cars. Before World War II, he returned to the United States, where he co-designed the Willys MB.

The Willys MB and the Ford GPW were both formally referred to as the U.S. Army truck, 1⁄4‑ton, 4×4, command reconnaissance. They were commonly known as the Willys Jeep, Jeep, or jeep, and were sometimes referred to by their standard Army vehicle supply number, G-503. Whatever they were called, they were highly successful American off-road capable, light military utility vehicles. About 640 000 were built to a single standardized design for Allied forces in World War II, from 1941 until 1945.

Much of the success of the Willys jeep came from its use of the L134 Go-Devil engine. This was probably the company’s greatest asset.

The engine started life as a less than impressive 36 kW 4-cylinder automobile engine. Roos increased its performance and durability beyond the Quartermaster Corps specification of 115 Nm of torque at the rear axle. The extra power made it the engine of choice for the U.S. Army. It ended up as a 2.2 litre undersquare engine = 79.4 mm bore x 111.1 mm stroke, with an L-head design with valves parallel to the cylinders. Initial power output was 45 kW at 4000 rpm and 142 Nm of torque at 2000 rpm. Compression = 6.48:1. In the Utility Wagon power was increased to 47 kW. The L134 engine was phased out in favour of the F-head Willys Hurricane engine beginning in 1950.

An oversquare engine has a bore that is larger than its stroke. This allows larger valves and more space for air and fuel to enter and exit the cylinders, potentially leading to higher horsepower at higher RPM due to the increased airflow. An undersquare engine, with a longer stroke, typically produces more torque at lower RPM, which can improve its pulling power.
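The 2.2 litre figure can be checked from the bore and stroke, since displacement = π/4 × bore² × stroke × number of cylinders. A quick Python sketch of my own, purely as an illustration:

```python
import math

def displacement_cc(bore_mm: float, stroke_mm: float, cylinders: int) -> float:
    """Engine displacement in cubic centimetres from bore, stroke and cylinder count."""
    bore_cm, stroke_cm = bore_mm / 10, stroke_mm / 10
    return math.pi / 4 * bore_cm ** 2 * stroke_cm * cylinders

# L134 Go-Devil: 79.4 mm bore x 111.1 mm stroke, 4 cylinders -> ~2 200 cc = 2.2 litres
print(round(displacement_cc(79.4, 111.1, 4)))
```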

Stevens’ early work owes a great deal to the streamlined style of New York designers who emerged in the 1920s, such as Walter Dorwin Teague (1883 – 1960), Norman Bel Geddes (1893 – 1958), Raymond Loewy (1893 – 1986) and Henry Dreyfuss (1904 – 1972). He was especially influenced by the European custom automobile designer Alexis de Sakhnoffsky (1901 – 1964), born in Kyiv, Ukraine. Stevens owned a car, made by the Cord Automobile Company of Indiana, that Sakhnoffsky had customized. He met Sakhnoffsky in Chicago in 1934, and was impressed to learn that automotive design could pay over $300 a day. Stevens’ early car designs were sometimes quite reminiscent of Sakhnoffsky’s creations, which emphasized streamlining. For a short period in the 1950s Sakhnoffsky worked for Stevens as a staff designer.

Stevens had his office in Milwaukee because of its proximity to manufacturers. Allen-Bradley, Harley-Davidson, Cutler-Hammer and Edmilton were among his early clients, all of which were located in Milwaukee.

He worked more as a sales person, than a designer. He had a support staff of about twenty. After being hired to design/ redesign a product, he would usually make some quick sketches, showing the basic lines of the new design, with an emphasis on the product’s look and function. A staff designer would make renderings from these sketches, often using an airbrush or marker. Stevens would then meet with the client and decide which version was best. Next, a three-dimensional model, either at scale or in full size, would be made by an in-house model-maker. Initially sculpting clay was used, but in the 1960s the firm used cast fiberglass. Then, the model was painted/ embellished.

I share one trait with Stevens, a love of peanut butter. He was frustrated by jars with necks so narrow that it was impossible to get all the peanut butter out of the jar’s shoulders. Stevens insisted on a wide mouth.

When Willys made their prototype jeep shortly after World War II began, it was overweight compared to Army requirements, but the powerful engine and its heavier, more robust transmission were a power train combination that proved beneficial in the long run for cross-country travel.

Willys did not have their own facilities for automotive bodywork. Their sheet metal manufacturing was outsourced to appliance manufacturers. These fabricators were restricted in the shapes they could produce: body panels could not have an offset of more than about 150 mm. In sheet metal drawing, a die forms a shape from a flat sheet of metal, the blank. This material is forced to move and conform to the die. Pressure is applied to the blank and lubrication applied to the die or the blank. Wrinkles will occur in the part if the blank moves too easily. The correction is more pressure or less lubrication, to limit the flow of material and cause it to stretch. Too much pressure will result in the part becoming too thin, leading to breakage. Drawing metal requires finding a balance between wrinkling and breaking to produce a useful part.

To be a successful automotive designer, Stevens had to understand many different production processes: conventional sheet metal drawing, deep drawing, piercing, ironing, necking, rolling and beading.

Confession: I have attempted to buy one American car in my life, a 1956 Willys Station Wagon (rear-wheel drive). Unfortunately, my savings were just too little to afford it. I have also been attracted to some American vehicles produced by: International Harvester, Rambler, Studebaker and Willys. The Big Three = General Motors/ Ford/ Chrysler? Not so much.

This vehicle was in production from 1947 as the Willys Station Wagon (model 463), using the same engine and transmission as the jeep, and with clear styling influence from the CJ-2A Jeep. The CJ in the model designation refers to civilian jeep. In 1948, a Jeep Utility Truck emerged. Both it and the wagon were available with four-wheel drive; in that form the wagon was referred to as a Utility Wagon, the ancestor of all sport utility vehicles. It was an all-steel vehicle, with a fake woody body.

A Jeepster convertible, model designation VJ, was also available from 1948 to 1950, with rear-wheel drive only, and the 4-cylinder Go Devil Engine. VJ apparently doesn’t mean anything. Wikipedia tells us the Jeepster’s problems in the marketplace were due to its limited utility and practicality. It looks rugged and off-road capable, but is not. Appeal is limited due to the basic construction, poor all-weather protection, and the low performance when equipped with the I-4 engine. Even with an optional six-cylinder engine and offering the VJ3 version at a lower price, Jeepsters did not draw many new buyers. One specific factor turning potential buyers away, was the lack of roll-up door windows.

Stevens also designed Harley-Davidson motorcycles, especially the 1949 Hydra-Glide, one of his first, helping create its new front suspension forks, bucket headlight and streamlined design. All Harleys since, including models in production now, are based on Stevens’ body designs.

Stevens worked with branding. His most ambitious projects were an ongoing series of package and logo designs for Miller Brewing in the early 1950s. For 3M, he developed a unified package design in 1965 that was applied to 35 000 different products.

Stevens started working for Studebaker in 1961, when the company was already in a lot of trouble financially. The auto manufacturer closed its South Bend factory in December of 1963. Thus, the only Stevens designs that Studebaker produced were the 1962 and 1963 Larks and Hawks and the 1963 Wagonaire. He did create several prototypes for other cars, projecting designs for the company all the way out to 1968.

Stevens’ relationship with Outboard Marine Corporation (OMC) was always extremely strong, in part because he was a friend of Ralph Evinrude (1907 – 1986). Stevens’ staff designers, especially Gordon Kelly and John Bradley, executed hundreds of designs for OMC’s products: Evinrude and Johnson outboard motors, Lawn Boy mowers, Cushman scooters, carts and motorcycles.

Stevens acknowledged the fact that all of his designs were ephemeral. He envisioned good design as changing from year to year, to adapt to new technologies and new tastes. Planned obsolescence was a phrase he popularized, which he defined as instilling in the buyer the desire to own something a little newer, a little better, a little sooner than is necessary.

Words of the Year 2024

Some people call them fake Hermès Kelly bags, but replica or faux sounds so much nicer. Whatever one wants to call them, this approach is less expensive than the real thing. None of my friends can tell the difference, so spending money on authenticity is wasted. The smaller bag is comparable to the original 25 cm, while the larger one is 28 cm. The bag is named after Grace Kelly.

This is the fourth Words of the Year. Instead of focusing on a single word each month, this year the focus is on phrases that have attracted my attention in some way. Some of these came about while listening to a music video, or reading a book, when a phrase jumped out at me and demanded my attention. Then again, some come from the usual sources…

Brain rot

Oxford’s expression of the year, Brain rot = the supposed deterioration of a person’s mental or intellectual state, especially viewed as the result of overconsumption of material (now particularly online content) considered to be trivial or unchallenging. The term gained prominence in 2024, capturing concerns about consuming low-quality online content, especially in excess and on social media. Its first recorded use was in American naturalist, essayist, poet, and philosopher Henry David Thoreau’s (1817 – 1862) Walden (1854), a reflection upon simple living in natural surroundings, and his essay Civil Disobedience = Resistance to Civil Government (original title).

Casper Grathwohl, Oxford Languages president, said: “Brain rot speaks to one of the perceived dangers of virtual life, and how we are using our free time. It feels like a rightful next chapter in the cultural conversation about humanity and technology.”

The five unsuccessful shortlisted words were: demure = reserved or responsible behaviour; dynamic pricing = a situation where the price of a product or service varies to reflect demand; lore = a body of facts and background information related to someone or something; romantasy = a fiction genre combining romance and fantasy; and slop = low-quality artificial intelligence generated online content.

The Power and the Glory

The Power and the Glory alludes to the doxology often recited at the end of the Lord’s Prayer: “For thine is the kingdom, the power, and the glory, forever and ever, amen,” to give one example of many variants.

Graham Greene (1904 – 1991) used this phrase as the title of a 1940 novel about a renegade Catholic ‘whisky priest’ living in Tabasco, Mexico in the 1930s, when the Mexican government was attempting to suppress the Catholic Church. It was initially published in the United States under the title The Labyrinthine Ways. A Wikipedia article about the novel explains the plot, and other details.

Whisky priest refers not only to Graham Greene’s unnamed protagonist; more generally it refers to a person or fictional character who shows clear signs of moral weakness while preaching to a higher standard. That preaching does not have to take place in a church.

Greene was part of the Catholic literary revival, a term that has been applied to a movement towards explicitly Catholic allegiance and themes among leading literary figures, generally converts to Catholicism. The other writers belonging to this movement that I have read extensively are G. K. Chesterton (1874 – 1936) and the Norwegian Sigrid Undset (1882 – 1949), who translated Chesterton’s work, but also gave the world her best-known work, Kristin Lavransdatter (a trilogy, 1920 – 1922), about medieval life in Norway from a woman’s perspective. Other prominent Norwegian writers who have also converted include 2023 Nobel literature prize winner Jon Fosse (1959 – ) and Karl Ove Knausgård (1968 – ).

Armed and Dangerous

Sometimes phrases are humorous while at other times they are deadly serious. Here is one example, looking at the not-so serious aspects first.

Armed and Dangerous might refer to Armed and Dangerous (original title: Вооружён и очень опасен = Vooruzhyon i ochen opasen = Armed and Very Dangerous), the 1977 Soviet-Czech-Romanian western, based on the novel Gabriel Conroy (1875/ revised 1882) by Bret Harte (1836 – 1902). This movie was filmed completely in Russian, yet most of the main characters were played by non-Russian speaking actors (Lithuanian, Romanian and Czech), who were dubbed by Soviet voice actors.

In 1986 another film appeared with the same title, minus Very. John Candy (1950 – 1994) had a major role in it. It opened to poor reviews and low ticket sales. After 38 years, I have still not found that this film has any redeeming qualities. In particular, it is not especially funny. My favourite John Candy film, watched many times over the years, often in classrooms, is Canadian Bacon (1995), where he portrays American Sheriff Bud Boomer, enthusiastically wanting to go to war against Canada. Candy was Canadian! These days, I think American President-elect Donald Tariff must have been secretly watching the film, and Bud’s role especially.

On from John Candy to Carl Canedy, producer of the Anthrax thrash metal song Armed and Dangerous, released at the beginning of 1987. This is not the track I would recommend to get your mother to love thrash metal. At best it is mediocre. The defining song of that genre is Holy Wars… The Punishment Due (1990) by Megadeth, with its intricate rhythm, memorable leads and bass, and the meaning of its lyrics, written in Northern Ireland during the Troubles. Some regard this as the best thrash metal song ever! Unfortunately, there is no mention of Armed and Dangerous in it.

Wikipedia reminds us that philosophically, thrash metal developed as a backlash against both the conservatism of the Reagan era and the much more moderate, pop-influenced, and widely accessible heavy metal sub-genre of glam metal which also developed concurrently in the 1980s. Derived genres include crossover thrash, a fusion of thrash metal and hardcore punk.

The more serious side involves fungi and other plant pathogens, where the armed part is their ability to mutate. These are an ever-present and real threat to humanity. There are many examples, including: potato blight = Phytophthora infestans, infecting potatoes and tomatoes; black sigatoka = Mycosphaerella fijiensis, infecting bananas and plantain; witchweed = Striga hermonthica, a parasitic plant infecting many species including corn, millet and grasses; rice blast = Magnaporthe oryzae, infecting grasses; Asian soybean rust = Phakopsora pachyrhizi, infecting soybeans and other legumes.

In the 1960s, stem rust led to the resistant wheat varieties that fueled a green revolution. Many farmers believed they were finished with the rust fungus Puccinia graminis. But in 1998, a dangerous new strain, Ug99, emerged in Uganda. By 2004, its spread prompted Norman Borlaug (1914 – 2009), an American agronomist and winner of the Nobel Peace Prize, to investigate it further. His research contributions had led, indirectly, to extensive increases in agricultural production (the green revolution). Ug99 threatens wheat crops throughout Africa, the Middle East and Asia. It has been found from Spain to Siberia. Its presence increases the risk of famine in Pakistan, India and other locations where small farmers cannot afford fungicides.

Tarted up

Here is a phrase from the Free Dictionary, using tarted up: We tarted up the apartment with a pink shag carpet. The dancers tarted themselves up in feathers and sequins. I live with a greenish shag carpet. Pink was used extensively in my childhood living room, but not with a shag carpet. In yet another Guardian article it was pointed out that fashion brands mislabeled real feathers as faux. Wikipedia could tell me that sequins made with nautilus shell were found in Indonesia, dating back 12 000 years. While not having any clothing that uses feathers or sequins, I use pink extensively, and most recently ordered two more phone cases in the same pink I have used for the past two years. This is to ensure my Asus Zenfone 9, purchased in 2022, will last until I turn 80, in 2028. Currently (2024) there is a Zenfone 11. Zenfone 13 is a delightful designation that I expect to emerge in 2026. By 2028, it should be sufficiently dated to attract a sizable price discount.

John Camden Hotten (1832 – 1873) in Dictionary of modern slang, cant, and vulgar words (1864) has this to say about tart: a term of approval applied by the London lower orders to a young woman for whom some affection is felt. The expression is not generally employed by the young men, unless the female is in ‘her best,’ with a coloured gown, red or blue shawl, and plenty of ribbons in her bonnet—in fact, made pretty all over, like the jam tarts in the swell bakers’ shops.

Pascal Tréguer (? – ) adds: The word [tart] therefore was originally a term of endearment, and what most probably happened was an ordinary semantic extension of tart, from the literal sense of a small open pastry case containing a sweet filling to the figurative sense of a sweet woman.

Loud Budgeting

For me, loud budgeting was always known, but nameless, before it received a name from Pass Notes Shops and Shopping in the Guardian newspaper. The term emerged in a viral post by TikToker Lukas Battle (1997 – ). My wife, Trish, does not follow anyone on TikTok. She tells me that when we spend money on almost anything, it is our scarcity brain taking control. Her focus now is on hoarding avoidance. This has not always been the case. Our goal during our first year (1980-1981) in high priced Norway was to be able to buy enough food. At the end of that first year, we had saved enough to buy a radio that also played tape cassettes, a luxury.

As an accountant, Trish was well aware that budgets have to be loud. Quiet budgets just encourage excessive spending. We have always been open about our economic situation, especially with our children. When one lives in relative poverty, indulging in quiet luxury is immoral. One is thankful one has a coat and a backpack. I am not sure either of us would recognize a Hermès Kelly bag (mentioned here, only because it was part of the Guardian article). For over forty years, we have rejected aspirational consumerism and have embraced thriftiness. Please do not ask about our VW ID.Buzz.

While the Hermès Kelly bag was popularized by Grace Kelly (1929 – 1982), it was just a simple, practical design. Kelly allegedly used it to hide her pregnancy in 1956. Yet, the bag was created in 1935 by Robert Dumas and called the Sac à Dépêches = dispatch bag, which in itself took design elements from the older Haut à Courroies (HAC) bag = top with belts bag, which was designed in the 1850s. There are website posts that explain details about this bag’s design, which allow/ encourage anyone to make their own DIY version. When doing so, please avoid using animal products.

Battle claims loud budgeting is being like rich people. They hate spending money. He encourages people to broadcast spending limits, and to be financially upfront with friends.

Then there are things I don’t understand, like putting deactivation stickers on credit cards. I only use a credit card to provide additional security when making some types of purchases, such as hotel accommodation in foreign countries. It had been four years since my credit card was last used! Except, before this text was revised, my bank sent me an email claiming that they would not renew my credit card unless I used it. So, I used it for a purchase, then (for good measure) made a couple of other random purchases with it, while visiting Newfoundland.

Trish told me about one Norwegian woman who claimed that she was now leading a life of luxury. She was now able to eat nutritious meals every day! She was so rich that she could allow her children to make choices about what they wanted to do in their free time. I agree, that is what life should be, but I hope it becomes regarded as normal, rather than luxurious.

Battle is living in a vastly different world from the one I populate. In his world, people receive costly social invitations, that should be turned down, with honest reasons about why. He is following luxury brands on social media, that should be unfollowed. He is proudly packing his own lunch, and eating it in public. He claims that his actions are not communist, but contain the secret ingredient of capitalism: being mean. Being mean isn’t new, but it could be a novelty for Gen Z or Gen α. He wonders if they can survive financial catastrophe, by bringing their own coffee from home.

Coffee

My son, Alasdair, tells me that the greatest quality reduction in coffee from farm field to sipping/ drinking/ gulping the final product, comes in the grinding of its beans. He claims that one has only 15 minutes from grinding to filtering to produce coffee at its best. Thus, he is contemplating the purchase of a grinder. I might too, but the operative word here is might, because buying one would mean breaking a purchasing habit that has lasted for 35+ years: it would mean replacing Coop Red = filtermalt, mørk og kraftig = filter-ground, dark and strong coffee, with unground coffee beans. I even know where the red coffee is located in our local store, but have no idea where coffee beans can be found.

Fortunately, I am better at comparison shopping online. I announced my finding of an Andersson CEG 1.0 grinder from Net on Net that uses 200 W to produce up to 70 g of coffee. It has a 3.7 (of five) star rating and costs NOK 200 now and during Black week (delivered). Normal price appears to be about NOK 120 – 140, so I suggested we wait until the sales start in the new year before purchasing it. I could also tell Alasdair that the most popular grinder with variable grain size settings, is a Wilfa WSCG2 with a 160 W motor that can produce 130 g of coffee. It costs NOK 740 and has a rating of 3.8. To obtain a higher rating (4.2) the price increases to NOK 4 000.

Unfortunately, my son knows I waste money buying inappropriate tools, then regret my decisions and buy something more appropriate. He suggested that I should watch some coffee grinder reviews on YouTube that might explain the price difference. I watched three short, but informative, reviews before admitting that I would reconsider my choice. Conclusion: I will live without a coffee grinder, and continue to use Coop red coffee.

Overnight Sensations

Despite my resolve to focus on phrases, it is sometimes necessary to understand some of the new words that are entering a language and influencing those phrases. These typically come from younger generations, even beyond that of one’s children. Now that I have retired from teaching, I have very little input from anyone described as Gen Z or Gen α.

Anna Spanish’s founder, Anna Latorra, provides some contemporary slang, noting its “power to turn phrases into overnight sensations.” Some of the more recent terms, with their Urban Dictionary meanings, are listed below.

Pookie = A nickname you call your best friend or someone you really love.
Gyat = Short term for god damn.
Simp = When a man is overly submissive to a woman and gains nothing from it. Example: “Guys simp in her Instagram replies and she doesn’t even notice them.”
Rizz = Comes from the word “charisma.” In southern Baltimore they’ve started shortening it, using “rizzma” (the noun replacing “charisma”) and “rizz” (the action of showing charisma).
Coquette = Mainly an aesthetic based on reclaiming girlhood and embracing a fun-loving, bubbly personality.
Preppy = A “preppy” girl is a girl who wears the “preppy aesthetic” style; this includes wearing Roller Rabbit, Love Shack Fancy, Sassy Shortcake, American Eagle, etc.
Yeet = To violently throw an object that you deem to be worthless, inferior or just plain garbage.
NPC = Short for non-playable character; the opposite of a main character. This person is usually a background character in your life that doesn’t have significant importance.
Moots = Short for “mutuals.” It’s when you follow someone on social media and they follow you back.
No cap/capping = The phrase “no cap” is meant to convey authenticity and truth. Example: “No cap, ‘Barbie’ is the movie of the year.”
Ick = Something someone does that is an instant turn-off for you, making you instantly hate the idea of being with them romantically. Example: “His cargo shorts gave me the ick.”
GRWM = Acronym for “Get Ready With Me.” A “GRWM” video is a vlog where you film everything that you do in your morning, night, etc. routine.
Delulu = A delusional fan girl/boy who believes they can/will end up with their favourite idol or celebrity and invests an unhealthy amount of time and energy in said idol.
Cheugy = The opposite of trendy.
Bussin = What you would say if something was really good.
Opps = Anyone in competition with or against you. Enemies.
Sus = Giving the impression that something is questionable or dishonest; short for suspicious.
PFP = Short for profile picture.
OOMF = Short for “one of my followers”, usually used on X and TikTok to talk about one of your followers without mentioning their name.
Beige flag = Something that’s neither good nor bad but makes you pause for a minute when you notice it and then you just continue on; something odd. Similar to “red flag” (a bad sign) and “green flag” (a good sign).
Sheesh = A word used as a substitute for “Daaaamn!”
OK Boomer = A slang term used as a response to someone from the Baby Boomer generation. Example: Boomer: “When I was your age I already owned a home.” Gen Z: “OK Boomer. Houses cost like $12,000 back then.”
Heather = When someone says that you’re “Heather,” they mean that everybody can’t help but like you.
Mid = Used to insult or degrade something you don’t like, labeling it as average or poor quality. Example: “Personally, I thought ‘Barbie’ was mid.”

I discovered Argentinian Caro(lina) Kowanz, currently living in Germany, on YouTube, where she provides short videos explaining English. She portrays different roles in the same video. In one, for example, she was a passenger on an aircraft who faced assorted problems, including a seat that would not recline, and an air vent that couldn’t be closed. She was also the flight attendant, who acknowledged these problems but was unable to resolve them.

In another video, she tried to help viewers understand the English idioms listed below. I have not used any of these idioms.

Ghosting = an abrupt cessation of communication with another person. Often involving dating.

Throw shade = indirect criticism of someone.

On fleek = something looks perfect! Yes, this is positive.

Spill the tea = tell the gossip, especially with all the drama that goes with it.

Low-key = something that you don’t want to make an issue about, in contrast to high-key where one feels very strongly about an issue.

Binge-watch = watching many episodes of a series without taking a break.

Lit = something is amazing or very exciting.

Slide into someone’s dms = send a direct message to someone on social media.

No cap = honestly.

Bet = Right or correct.

That fit goes hard = That is a very nice outfit you are wearing.

Then on Christmas Eve, 2024-12-24, in a Guardian newspaper article, I came across three Chinese neologisms about China’s current predicament – slowing economic growth, a falling birthrate, a meagre social safety net, and increasing isolation.

躺平 = Tangping = lying flat, a term used to describe the young generation of Chinese who are choosing to chill out rather than hustle in China’s high-pressure economy.

润学 = Runxue = run philosophy, which refers to the determination of large numbers of people to emigrate.

内卷 = Neijuan = rolling inwards = involution, a term used to describe the feeling of diminishing returns in China’s social contract. This is a concept from sociology that refers to a society that can no longer evolve, no matter how hard it tries. Applied to the individual, it means that no matter how hard someone works, progress is impossible.

Software: An Update

For me, it is always difficult to know how to illustrate software. I had thought of using a photo of an IBM punch card, or even a key punch machine, used to produce the finished cards, which was the first way I input data. These cards were then put in sequential order, before being handed in to a punch card operator. At some point they were fed into and read by a card reader. Since it could take up to two hours before one was given a printout of the results, one had to work on two or three different projects to make optimal use of time. Then I remembered that my use of punch cards was over 50 years ago, before smartphones, laptops, even PCs. Even in the 1970s, I was using a keyboard to input data. I searched through Unsplash until I found this illustration showing software in 2024. It was made by Dadi Prayoga, from Indonesia, who uploaded it to Unsplash on 2024-12-17.

I ain’t perfec, and neither is software. One of the challenges people face is how they frame software, and its imperfections. In some conversations, it seems that users ascribe religious attributes to it. They worship the technology, are willing to forgive some of its sins and sinners, but condemn other sins, and crucify these sinners. This is true of operating systems (OS), programming languages, web browsers, almost every form of software. I attempt to be a software agnostic.

Ideally, what I want to do in this post is explain how competent computer scientists/ software engineers approach problems. To begin with, it would be useful/ appropriate to look at how they learn to be professionals. After an initial programming course, the next step is often a course in algorithms and data structures. Students learn how to frame problems in terms of these key elements. After gaining a basic understanding of these, the student is expected to learn, in depth, how data is organized and how programs are written. One of the answers to both is: in a database. There are many different types of database systems. There are also courses about computer hardware, operating systems, and system design.

Within a software engineering education, there are certain people and texts that have come to prominence. These are examined critically. In this post several of them will be mentioned.

Ted Codd (1923 – 2003) expressed the requirements for a relational database in thirteen rules, numbered from 0 to 12. These define what a database management system has to provide for a database to be considered relational. Codd introduced the relational model in 1970, and published the rules in 1985. Non-relational (NoSQL) databases have not disappeared, but relational systems still dominate. In 2024-12, the most popular database engines were: Oracle Database, MySQL, Microsoft SQL Server, PostgreSQL, Snowflake, IBM Db2, SQLite, Microsoft Access, Databricks and MariaDB.

Relational databases store data in tables. A database can consist of several/ many tables. A table has columns, with the same type of data stored in each column. Each record forms a row. A primary key is a field (or combination of fields) used to identify a record. The point here is that the primary key has to be unique.
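As a minimal sketch of this, using Python’s built-in sqlite3 module (the table, columns and example values are invented for illustration), a database engine simply refuses a second row that reuses an existing primary key:

```python
import sqlite3

# Purely illustrative, in-memory database.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE person (
        person_id INTEGER PRIMARY KEY,   -- must be unique for every row
        name      TEXT NOT NULL,
        city      TEXT
    )
""")
con.execute("INSERT INTO person VALUES (1, 'Anna', 'Trondheim')")
con.execute("INSERT INTO person VALUES (2, 'Bjørn', 'Bodø')")

# Re-using primary key 1 is rejected by the engine, which is the point.
try:
    con.execute("INSERT INTO person VALUES (1, 'Carl', 'Steinkjer')")
except sqlite3.IntegrityError as err:
    print("Rejected:", err)    # UNIQUE constraint failed: person.person_id
```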

One of the most important rules is to avoid redundancy. Duplicate information in a database schema = the structure of a database described in a formal language, can lead to inconsistencies. If the same data is stored in multiple tables, there is a risk that it will be updated in one table but not in others, resulting in discrepancies, which is a polite term for errors.
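A hedged sketch of the same point, again with invented names: in the first design an address is copied into every row, and the copies can drift apart; in the normalized design it is stored once and referenced through a key.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Redundant design: the address is repeated in every visit row, so a move
# must be written to every copy, or the copies will disagree (an error).
con.executescript("""
    CREATE TABLE visit_redundant (
        visit_id    INTEGER PRIMARY KEY,
        person_name TEXT,
        address     TEXT,
        visit_date  TEXT
    );
""")

# Normalized design: the address is stored once, and each visit refers to
# the person through the person_id key, so one update is always enough.
con.executescript("""
    CREATE TABLE person (
        person_id INTEGER PRIMARY KEY,
        name      TEXT,
        address   TEXT
    );
    CREATE TABLE visit (
        visit_id   INTEGER PRIMARY KEY,
        person_id  INTEGER REFERENCES person(person_id),
        visit_date TEXT
    );
""")
# Updating person.address once is now sufficient; every visit sees the
# change through a join, so the inconsistency described above cannot arise.
```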

Open Source

Generally, I use the term open-source to describe software that is designed for anyone to examine, use, modify and distribute as they see fit. It is often developed collaboratively, by corporate partners or groups of people, who may be professionals, or talented amateurs. There are also examples of a single person developing all of the code for a project/ app. Other people want to emphasize the freedoms involved, and usually the lack of cost, so they refer to it as free and open-source software (FOSS).

It is inappropriate to abbreviate open-source as OS, because that is the common, accepted abbreviation for operating system. So today, FOSS is most often used.

At one time FOSS was only advocated by long-haired, un-showered, left-leaning hippies. That was a long time ago. Now mega-corporations have discovered that using FOSS programs can reduce their costs significantly, and increase profits they can give to shareholders. The main idea is to make something appropriate once, then let everyone use it.

Yet the world of FOSS is changing in other ways. Simple programs are evolving into multi-task commodities (yes, that is the term often used) not to solve problems in a better way, but to make overarching products. Before, it was more common for a program to solve a limited task, appropriately. Unfortunately, this increased scope frequently results in badly designed programs that function poorly in practice. In addition, security and related issues arise: solutions that were suitable for a product with limited capabilities may be unsuitable for an expanded product, and the resulting issues are increasingly overlooked.

The FOSS ecosystem has contended with problems of scale and complexity at the same time FOSS has become increasingly important. In server environments, especially, there is no need for multiple vendors to produce competing products. Thus, they often work together to make a product that suits all of their needs.

Software is difficult, and it can make life more difficult because some decision makers do not understand how to evaluate it. Instead, they think that purchasing something can be a quick way of solving their challenges.

Software complexity and growth rates have overwhelmed traditional open-source governance models. New approaches are needed, but they also need to be evaluated to determine which offer improvements over existing systems.

FOSS participation is declining, in terms of funding, headcount and other metrics. Some for-profit organizations are invading FOSS domains. They offer free software that solves some challenges, but then sell paid solutions, for anything beyond that minimum. This is often referred to as freemium.

Meanwhile, cyber threats keep evolving, with many open-source projects becoming targets for malicious activity. Users are more vulnerable, exposed to more vectors = ways of attacking, than ever before. Attackers are big, smart, nimble yet patient. This leads to more intricate strategies, resembling games. When attacks occur, they give greater rewards to the perpetrators. Attacks provide not just economic, but political, profits. It is often claimed that the North Korean and Russian states are involved in some of these.

The increasing storage of data in clouds = someone else’s server, and the increased use of Software as a Service (SaaS) creates new challenges. This effectively means that what amounted to cultural diversity is being replaced by layers of technical and organizational monocultures that may enable attacks.

Users can assist with FOSS. Those with programming and other technical skills can contribute code. Those with user experience, can review products. Everyone can use FOSS.

Meanwhile, organizations have reduced their computer-system expertise. They shift capital expenditures to operating expenses, and depend on cloud vendors for security. This is delusional, as these cloud vendors do not guarantee any security, apart from some best-effort propaganda. Similarly, when organizations use off-the-shelf hardware and software solutions, these are often a mix of FOSS and commercial software, that create elaborate attack surfaces, yet whose components and interactions are accessible and well understood. This allows attackers to hide in the open.

Operating systems

The most fundamental type of software is the operating system (OS). A device has to have one in order to run other software. Different operating systems are required for different purposes, so people are expected to know how to use these different operating systems on their phone/ tablet, laptop/ desktop, media player and server. There are also specialty operating systems for real-time systems, including robotics.

It is often stated that people are influenced by the history of devices that they use. The first computer I used was an International Business Machines System/360. I am unsure which model, or even the type of operating system. There were several available: Basic Operating System/360 (BOS/360), Tape Operating System (TOS/360), and Disk Operating System/360 (DOS/360). I was using it to learn SPSS = Statistical Package for the Social Sciences. Then I took a course in programming, based on PL/1 = Programming Language One, from IBM.

After that I used a number of different mini-machines. First there was a Hewlett-Packard in the 2100 series, probably a 2100S, with the Real Time Executive (RTE) operating system. In Molde, in Norway, I began using a Digital Equipment Corporation (DEC) VAX-11/780, with a VAX/VMS operating system. I became very familiar with this machine, and used it extensively for many different purposes, including the construction of database systems as well as simulation models. When we moved to Bodø in 1985, Norsk Data NORD-500 machines were used, running the Sintran III operating system.

When I took a position in Steinkjer in 1988, they were in the process of migrating from Norsk Data to Digital Equipment. I saw first-hand how inappropriate purchasing decisions could be made. First, the college was told about the PRISM project to produce updated VAX machines, accompanied by the MICA project, which intended to consolidate VMS and ULTRIX into a single operating system. The school believed this was actually happening. They ordered workstations of type DECstation 3100, specifically designed and built to run a UNIX system, ULTRIX. They then discovered that no version of the VMS operating system would be released for the DECstations. They would not integrate with the other VAX equipment that had also been ordered. The key takeaway is never to believe computer sales people. There is just too much vaporware that never emerges!

Fortunately, the Norwegian government had another mission, to bail out the faltering Norsk Data. They provided colleges with free money to buy lots of Nord computers to save the company from bankruptcy. So more expensive computers were bought. The Sintran operating system was modified so that it could mimic UNIX.

None of the above computers have any relevance today. This is the way of technology. For almost sixty years I used ICE = internal combustion engine vehicles. Then I switched to an electric vehicle. I have no intention of going back. All of those years of gearing with a manual transmission are history. Similarly, my keypunch experience has no relevance in terms of computing today. Surprisingly, my touch-typing skills learned starting in ca. 1963, are fully relevant today.

UNIX

Modern computing begins with Unix, developed at the Bell Labs research centre in New Jersey in 1969. A consensus was emerging in the computer world that having a handful of companies each producing a computer with its own OS was counterproductive. It just took time for an appropriate OS to emerge. Some important books about computing have been published that point towards open standards.

There were books that attempted to expand systems thinking beyond computers, two of which were written by Donella Meadows (1941 – 2001). These were Limits to Growth (1972) and Thinking in Systems: A Primer (1993/ restructured by others 2008). The major problems of the world = war/ hunger/ poverty/ environmental degradation are due to system failure. They cannot be solved piecemeal by fixing any one piece in isolation. This lesson has not been learned.

In terms of computing, the most influential book on my list is Tracy Kidder’s (1945 – ) The Soul of a New Machine (1981). It explains the drama/ comedy/ excitement/ boredom/ exploitation of workers by a company bringing a new minicomputer to market. This book explains a common attitude of start-ups, a total unconcern about the physical and mental health of employees, who are regarded as replaceable cogs in a machine.

Andrew S. Tanenbaum (1944 – ) wrote Operating Systems: Design and Implementation (1987), with Albert S. Woodhull joining as co-author on later editions. This book introduced me to operating system principles. While the source code for MINIX was included as part of the book, my own computer science education required me to construct my own operating system as a semester project. In addition, I also had to implement a compiler for a language resembling Pascal (1970), developed by Niklaus Wirth (1934 – 2024).

At this point, one should probably mention Wirth’s law, which is an adage = a saying that sets forth a general truth. This one states that software is getting slower more rapidly than hardware is becoming faster. Wirth discussed it in A Plea for Lean Software (1995). Pascal was based on Algol 60 = Algorithmic Language (1960). In Norway, this formed the starting point for Simula = Simulation Language, developed in Oslo, and often called the first object oriented language.

Back at Bell Labs, Dennis Ritchie (1941 – 2011) created the C programming language, and with Ken Thompson (1943 – ) developed the Unix operating system. Sometimes inventors are not the best people to explain their inventions, so it was fortunate for these two that Canadian computer scientist Brian Kernighan (1942 – ) was better able to write about Unix and the C programming language.

I have learned several additional languages including: Objective C (1980) developed by Brad Cox (1944 – 2021) and Tom Love (? – ?), Smalltalk (1972) designed by Alan Kay (1940 – ), Dan Ingalls (1945 – ), Adele Goldberg (1944 – ) and others, at Xerox PARC = Palo Alto Research Center; and Prolog (1972) by Alain Colmerauer (1941 – 2017).

PCs

By 1985, almost everyone of a certain age with technical interests had discovered personal computers. One of the schools where I was working had ordered a computer lab worth of Apple II clones. Yet no one knew how to use them! I paid for our first family computer by teaching an introductory computing course to the teachers. The local Amiga distributor invoiced the school for the course, and in time we received an Amiga 2000.

At about the same time IBM-compatible personal computers were becoming more popular, and encroaching on the Apple II market. By 1984 Apple had seen the writing on the wall, and had produced a more advanced Macintosh computer. At the time, both Macs and PCs were too expensive for secondary schools. The operating systems on IBM and other personal computers were referred to as PC-DOS and MS-DOS, respectively. They were both renamed variants of 86-DOS, owned by Seattle Computer Products and written by Tim Paterson (1956 – ). Its development took six weeks. It was modelled on Digital Research’s CP/M, the dominant operating system for 8080/ Z80 machines, ported to run on 8086 processors. In addition there was improved disk buffering and a new file system.

Windows

Microsoft released Windows 1.0 on 1985-11-20, as a graphical operating system shell for MS-DOS in response to the growing interest in graphical user interfaces (GUIs). To begin with, this was just for people with special interests. Windows took off in 1990, with the release of version 3.0.

With the exception of an iMac, used to teach students studying media and communications, the work computers I have used ran various versions of Windows. XP is the version I liked best, followed by 7. I was assigned 8 when I retired.

The latest variant, Windows 11, was released 2021-10-05. For the past several months, Microsoft has been encouraging a transition from Windows 10 to 11. At the same time, they have said that hardware will have to meet specific requirements in order to be allowed to upgrade. End of support for Windows 10 is scheduled for 2025-10-14. Except, one can pay for a one-year delay, perhaps longer.

Shortly after Windows 11 came out, Microsoft and Intel found compatibility issues with Intel Smart Sound Technology (Intel SST) on 11th Gen Core processors running Windows 11. Microsoft applied a compatibility hold on affected systems, preventing those PCs from upgrading to Windows 11. Now, over two years later, Microsoft has resolved the issue and cleared a path for those computers to run Windows 11.

Windows 11 has minimum system requirements. An attempt to install it on other PCs will result in the installer refusing. There are unofficial ways to bypass this, but they are not for the faint of heart!

Users of Windows 7, 8, 10 or 11 may want to use Winaero Tweaker, a freeware app created by Sergey Tkachenko. It is an all-in-one application that comes with dozens of options for fine-grained tuning of various Windows settings and features. To encourage Windows users to consider Tweaker, several categories of its features are listed here.

Shortcut tools: launch an app as Administrator without a UAC confirmation; directly open any Control Panel applet or system folder; directly open any Settings page; create shortcuts to the classic Shut Down Windows dialog (Alt+F4), and Safe Mode; remove or customize the shortcut arrow overlay icon; remove the “ – shortcut” suffix; remove blue arrows from compressed files.

Manage Windows apps and features: restore the classic Windows Photos Viewer to use it instead of Photos; restore the classic sound volume pop-up slider; permanently disable Windows Telemetry and Data Collection; permanently disable Windows Defender; permanently disable Windows Update; disable ads and unwanted app installation (Candy Crush Soda Saga, etc.); enable the built-in Administrator account; enable automatic Registry backup; change drag-n-drop sensitivity; disable Action Center and notifications; reset the icon cache; reset all Group Policy options at once.

Networking options: change the remote desktop port; make mapped drives accessible for elevated apps.

Tune up Windows appearance: customize folders in This PC; customize entries in the Navigation Pane in File Explorer (the left pane); rename and change the icon for the Quick Access entry; increase the taskbar transparency level; show seconds in the taskbar clock; disable blur for the sign-in screen; customize fonts, and the Alt+Tab dialog appearance; change the title bar color for inactive windows.

Context menus: add handy context menus using a huge set of presets, e.g. to switch a Power Plan with one click, open a Command Prompt, or add a cascading Settings menu – plenty of them; hide default entries from the context menu, e.g. Edit with Photos, Edit with Paint 3D, etc.; add ‘Run as Administrator’ to VBS, MSI, CMD and BAT files; change the default app for the Edit context menu entry for images.

Mac OS

System Software = Mac OS = Classic Mac OS (a retronym) was a series of operating systems developed for the Macintosh family of personal computers by Apple Computer, from 1984 = System 1 to 2001 = Mac OS 9. It is credited with popularizing graphical user interfaces.

It was based on concepts from the Xerox PARC Alto computer. The Alto was conceived in 1972, in a memo written by Butler Lampson (1943 – ). It was inspired by the Stanford NLS = oN-Line System, developed by Douglas Engelbart (1925 – 2013) and publicly demonstrated on 1968-12-09. The Alto hardware was designed by Chuck Thacker (1943 – 2017). The Alto was the first computer to use a mouse-driven graphical user interface (GUI). Mac OS consisted of a Macintosh Toolbox ROM and a System Folder = a set of files loaded from a disk.

The Berkeley Software Distribution (BSD) was a Unix operating system developed and distributed by the Computer Systems Research Group (CSRG) at the University of California, Berkeley, beginning in 1978. It began as an improved derivative of AT&T’s original Unix from Bell Labs; over time, the AT&T source code was replaced by Berkeley’s own code.

After Steve Jobs (1955 – 2011) was forced out of Apple in 1985, he started NeXT, and developed NeXTSTEP, an object-oriented, multitasking operating system based on the Mach project at Carnegie Mellon University in Pittsburgh, which ran from 1985 to 1994. The end result was the Mach 3.0 microkernel, developed as a replacement for the kernel in BSD.

OpenStep was an object-oriented application programming interface (API) specification developed by NeXT. It was written in Objective C, and released 1994-10-19, providing a framework for building graphical user interfaces (GUIs) and for developing software applications. OpenStep was designed to be platform-independent, allowing developers to write code that could run on multiple operating systems.

By 1997 Steve Jobs was back in Apple’s good graces, and had become Apple’s CEO. In 1996, Apple Computer had acquired NeXT. Apple merged the user interface/ environment from classic Mac OS with NeXTSTEP and OpenStep to create Mac OS X. All of Apple’s subsequent platforms since iPhone OS 1 were then based on Mac OS X. This was later renamed macOS.

Linux

Wikipedia tells us that Linux is a family of open-source Unix-like operating systems based on the Linux kernel, an operating system kernel first released on 1991-09-17, by Linus Torvalds (1969 – ). Linux is typically packaged as a Linux distribution (distro), which includes the kernel and supporting system software and libraries—most of which are provided by third parties—to create a complete operating system, designed as a clone of Unix and released under the copyleft GPL license.

There is a lot of hype from computer manufacturers to use Microsoft Windows operating systems. Having consumers buy it undoubtedly adds to their profits, especially when it is almost impossible to buy a low-priced computer without Windows.

Microsoft is, in effect, requiring people with older hardware that runs Windows 10 to buy new computer hardware that is compatible with the Windows 11 requirements. This can be expensive. Another approach for Windows 10 users in this situation is to try a user-friendly version of Linux, to see if they feel comfortable with it. One such OS is the latest version of Linux Mint with the Cinnamon desktop.

A trial version can be explored by making a live version, which means copying a bootable version of Linux Mint onto a USB flash drive/ memory stick/ thumb drive. By booting up from this drive, Linux will be available. Those who, after this trial, feel uncomfortable using Linux do not have to do anything, except to avoid booting up from the USB drive again. Those that find they prefer Linux can, at some point, install it on their machine, either alone or as part of a dual-boot system with their original OS. The memory stick can then be used to boot Linux on other computers. Linux is particularly well suited for older hardware.
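Before copying an image onto a memory stick, it is worth verifying the download. Below is a minimal Python sketch, assuming the ISO and the checksum published alongside it have already been downloaded; the file name and checksum value are placeholders.

```python
import hashlib
from pathlib import Path

iso_path = Path("linuxmint-cinnamon-64bit.iso")   # placeholder file name
expected = "0123...abcd"   # value copied from the published sha256sum.txt

sha256 = hashlib.sha256()
with iso_path.open("rb") as f:
    # Read in chunks so a multi-gigabyte image does not fill memory.
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        sha256.update(chunk)

if sha256.hexdigest() == expected:
    print("Checksum matches: safe to write the image to the USB stick.")
else:
    print("Checksum mismatch: download the ISO again before using it.")
```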

One of the main characteristics of an operating system is its file system or, in the case of Android, the lack of a user-facing one, since it is largely up to each app to decide how its data is to be organized. At Cliff Cottage, we use a variety of file systems. On most machines, we use ZFS, previously known as the Zettabyte File System. It was originally released in 2006. There are some challenges with it, or more correctly with TimeShift, a program that allows users to roll their system back to a prior point in time, for example after a problematic update. TimeShift does not work with ZFS.
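Since TimeShift cannot manage ZFS, one workaround is to script snapshots directly with the standard zfs command-line tools. This is only a sketch: the dataset name is an assumption, and the script would need to run with sufficient privileges.

```python
import subprocess
from datetime import datetime

DATASET = "rpool/ROOT"   # assumed dataset name; adjust to the local pool layout

def take_snapshot() -> str:
    """Create a timestamped ZFS snapshot, e.g. rpool/ROOT@2025-01-31-1200."""
    name = f"{DATASET}@{datetime.now():%Y-%m-%d-%H%M}"
    subprocess.run(["zfs", "snapshot", name], check=True)
    return name

def list_snapshots() -> None:
    """Show existing snapshots, so an older state can be chosen for rollback."""
    subprocess.run(["zfs", "list", "-t", "snapshot", "-r", DATASET], check=True)

if __name__ == "__main__":
    print("Created", take_snapshot())
    list_snapshots()
    # Rolling back would be: zfs rollback rpool/ROOT@<snapshot-name>
```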

On some other machines we use btrfs = better/ butter file system, whose snapshot mechanism TimeShift does support.

Spyware

The content here began after Jeffrey Paul, hacker and security researcher living in Berlin, wrote, “… that in the current version of the macOS, the OS sends to Apple a hash = unique identifier, of each and every program you run, when you run it. Lots of people didn’t realize this, because it’s silent and invisible and it fails instantly and gracefully when you’re offline…. Because it does this using the internet, the server sees your IP [address], of course, and knows what time the request came in. An IP address allows for coarse, city-level and ISP-level geolocation, and allows for a table that has the following headings: Date, Time, Computer, ISP, City, State, Application Hash…. This means that Apple knows when you’re at home. When you’re at work. What apps you open there, and how often.”
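To make the idea of a program hash concrete, here is a small Python sketch that computes the kind of fingerprint being described; the path is just an example, and Apple’s actual mechanism differs in its details.

```python
import hashlib
from pathlib import Path

def program_hash(path: str) -> str:
    """Return the SHA-256 digest of a binary: identical files give the same
    digest, so the digest works as a fingerprint of a particular program."""
    digest = hashlib.sha256()
    with Path(path).open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder path; on a Linux machine one could point this at /usr/bin/firefox.
print(program_hash("/usr/bin/firefox"))
```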

This is not a unique problem for mac users. There are similar challenges with trackers on other operating systems. Paul continues, “Who cares?” I hear you asking. Well, it’s not just Apple. This information doesn’t stay with them…. requests are transmitted unencrypted. Everyone who can see the network can see these … This data amounts to a tremendous trove of data about your life and habits, and allows someone possessing all of it to identify your movement and activity patterns. For some people, this can even pose a physical danger to them. Now, it’s been possible up until today to block this sort of stuff on your Mac [with Little Snitch]…. The version of macOS that was released today [well, over four years ago: 2020-11-12], 11.0, also known as Big Sur, … prevent[s] Little Snitch from working the same way.”

Fortunately, this issue was resolved when Little Snitch 5 was launched. That said, people should be concerned about what their operating system is doing. Personally, living in Norway, I am less concerned about big government watching my activities than I am about big business. This is the main reason why I encourage people to use open-source software, when possible. People living in anything approaching an authoritarian regime have to be vigilant. The content presented here is suitable for politically active people living in full democracies, where such activism is not a life-threatening activity.

In each their own way, corporations such as Alphabet (owners of Google), Amazon, Apple and Meta (owners of Facebook) dominate vast zones of the world wide web, with multiple trackers following (or at least attempting to follow) everyone’s movements through cyberspace. YouTube, an Alphabet subsidiary, can suggest new videos based on each individual’s entire cyber footprint, and not just visits to YouTube. Facebook is even worse, asking rhetorical questions about whether one knows Josephine Doe. Of course the user does, otherwise they wouldn’t ask! This situation arises despite every web browser in use being equipped with the latest in ad-blocking and anti-tracking software. Why? Well, Corporatedom doesn’t respect privacy rights, necessitating this web-log post.

Personal comments

Our laptops continue to use Linux Mint, but upgraded to version 22. While we attempted to use a dual boot system with Windows 11, this proved impossible. So we upgraded our file system to ZFS – which is what we use on our server.

We intend to install Windows either on a dedicated machine, or as a virtual machine, at some unspecified point in the future. There are certain products that refuse to work under any system other than Windows. We have two: a library catalogue system, and a slide scanner. The system requirements for these are flexible. They will work with almost any version of Windows, including outgoing 10, current 11, outdated 7 or ancient XP.

For sentimental reasons, and its connection with Mandrake Linux, I would like to have one machine running the latest version of Mageia. Mandrake can be considered its forerunner, but with other intermediate steps. Mandrake was released in 1998, by Gaël Duval, mentioned above, to provide an easy-to-use Linux. In 2005-04, Mandrakesoft (as the company was then called) acquired Conectiva, a Brazilian-based company that produced a Linux distribution for Portuguese- and Spanish-speaking Latin America. As a result of this acquisition and a legal dispute with Hearst Corporation about the name Mandrake, the company changed its name to Mandriva. Mageia was formed as a fork of Mandriva in 2010. This occurred as the Mandriva subsidiary responsible for it was declared bankrupt, and its assets liquidated. Mageia Version 1 was launched in 2011. Version 9 was launched in 2023-08. It still has not been installed on any of my machines.

Android & iOS

Android was started in 2003 by Android, Inc., in Palo Alto, California. It was purchased by Google in 2005. Android 1.0 was released 2008-10. Unfortunately, a smartphone is nothing without its apps. Android software development is the process by which applications are created for devices running the Android operating system. Android apps can be written using the Kotlin, Java and C++ languages with the Android software development kit (SDK), while using other languages is also possible. However, all non-Java virtual machine (JVM) languages, such as Go, JavaScript, C, C++ or assembly, need the help of JVM language code, which may be supplied by tools, likely with restricted API support. Some programming languages and tools allow cross-platform app support for both Android and iOS.

iOS (formerly iPhone OS until version 4) is a mobile operating system developed by Apple exclusively for its mobile devices. The first-generation iPhone was launched 2007-06-29. Major versions of iOS are released annually; the current stable version, iOS 18, was released 2024-09-16.

It is the operating system that powers many of the company’s mobile devices, including the iPhone, and is the basis for three other operating systems made by Apple: iPadOS, tvOS, and watchOS. iOS formerly also powered iPads until iPadOS was introduced in 2019, and the iPod Touch line of devices until its discontinuation. iOS is the world’s second most widely installed mobile operating system, after Android.

Our hand-held devices continue to use Android as their primary OS. Android V13 is the operating system used on our Asus Zenfone 9 smartphones. Our son Alasdair uses this on his Samsung Galaxy, and our daughter Shelagh uses this on her Google Pixel. I never expect to be as comfortable using a hand-held device, as a computer. However, I do manage to use it, and the installed apps.

Hand-held devices (HHD): While some people refer to these as phones, telephony is only one use. For some, a smartphone or tablet is the only computing device used. Android may be open-source, but it also has a lot of proprietary input from Google and phone manufacturers. Personally, I kept threatening to go over to /e/, an e.foundation variant of the open-source LineageOS. This would have been easy with a Fairphone. However, quality issues with that device are keeping me in the Android camp for some additional years.

While I wrote about using a de-googlized Android OS from the e foundation, developed by Gaël Duval, it has not been installed. I intend to try it out on one of the retired Xiaomi Pocophone F1s some time in 2025.

Apple also makes smart phones, tablets and other devices that need an OS. For larger computers, Apple offers MacOS. For other products, their solution is iOS for iPhones, iPadOS for iPads, watchOS for Apple Watches, and tvOS for Apple TVs. Each system is designed to optimize the user experience on its respective device.

Sometimes there are other factors than age that prompt a hardware change. The fact that we have an electric vehicle (EV), where much of its control is based on the use of hand-held devices, prompted us to buy new ones some months in advance of the EV’s arrival. We opted for two identical Asus Zenfone 9 devices. Our children have opted for Apple, Pixel and Samsung devices.

Laptop/ Desktop: The lifetime of a laptop computer varies with the individual using it. Take one resident user at Cliff Cottage, Trish. She used one laptop for over 7 years without any major issues. At one point her hard-disk was replaced because it was full, and she was encouraged to install one with a larger capacity. In contrast, another resident user, Brock, has managed to go through four different computers in the same time-period. He has multiple lame excuses to explain why none of these worked out.

Many users find that the main problem with Windows is not the lack of software, but excessive quantities of bloatware = software with large, often unnecessary features, including some that spy on users. These features can make a machine slow/ unwieldy/ inefficient, using excessive amounts of memory or disk space.

Because people differ in their needs, it is difficult to list all of the user programs/ software/ applications/ apps people need on their personal computers. Many of the most important programs come installed with the operating system. Additional programs can be installed using the Software Manager. To find additional/ useful/ important open-source programs, AlternativeTo allows people to compare programs that may be useful.

Tails

The most basic way to achieve more privacy is to select an appropriate operating system, such as Tails. Tails is a security-focused Debian-based Linux distribution aimed at preserving privacy and anonymity against surveillance. It can run on almost any computer from a USB stick.

A live system contains a complete, bootable operating system, usually with writable storage that allows for customization, including the installation of software packages. It can save settings, and can be used for system administration, data recovery, or test purposes.

Tails lets users be anonymous on the Internet, and helps circumvent censorship. It does so by forcing all connections to the Internet to go through the Tor network. It leaves no trace on the computer unless explicitly ordered to do so by the user. It uses state-of-the-art cryptographic tools to encrypt your files, emails and instant messaging.

Despite being the best and simplest solution, most users are reluctant to use Tails. They deliberately select a second-best solution, because they are unwilling to familiarize themselves with a new operating system, slightly different from the one they are currently using. Yes, the writer of this post is in precisely the same situation.

Operating System Conclusion

A lot of ink has been used here to comment on operating systems. An OS is important because it is at the heart of computing. Yet operating systems also bring out a lot of emotion. In some respects they are like brands of cars. People may argue that a particular brand is better than another, but many people would set aside these prejudices if they were confronted with a choice between no car, and a car of their least favourite brand. Most people would prefer to have a computer/ smartphone with the worst possible OS, than not to have one at all.

Similarly, I don’t think anyone really wants an old OS version. They are just not appropriate to meet user needs. This means that Windows XP is the earliest OS that should ever be considered. Windows 7 is probably a better choice. The latest version of Linux Mint, is probably more appropriate than either of these two, but there can be cases where one needs a Windows operating system. In our household there are two: for a library system, and to digitize slides.

Web Browsers

Most well-known browsers are controlled by these large corporations: Apple offers Safari; Google has Chrome; Microsoft has Edge, and previously provided Explorer. Mozilla Firefox is an exception. It is not owned by big tech. However, Google has provided financing. For most people, Firefox is gudenuf to use.

Here at Cliff Cottage we use Firefox on all of our machines, but also have Brave, as an alternative. Brave provides shields that automatically block ads and trackers. This means that Brave loads sites faster than most other browsers. Shields also offer additional protection by blocking malicious webscripts. When possible, HTTP connections are automatically forced to become HTTPS connections. This is because HTTPS sites use a secure encryption protocol to protect user traffic, whereas HTTP is unsecured.
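The HTTP-to-HTTPS upgrade that Brave performs can be imitated in a few lines. This is only a rough sketch of the idea, using the third-party requests library; a real browser maintains upgrade lists and handles far more edge cases.

```python
import requests

def upgrade_to_https(url: str, timeout: float = 10.0) -> str:
    """Try the https:// version of a plain http:// URL first, and fall back
    to the original only if the secure variant cannot be reached."""
    if url.startswith("http://"):
        candidate = "https://" + url[len("http://"):]
        try:
            requests.head(candidate, timeout=timeout, allow_redirects=True)
            return candidate           # secure connection worked
        except requests.RequestException:
            return url                 # no HTTPS available; keep the original
    return url

print(upgrade_to_https("http://example.com"))
```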

Browsers need to be able to protect users from: phishing sites, deceptive websites that mimic legitimate websites in an attempt to get users to provide personal information; web trackers/ cookies, internet scripts that follow users’ browsing habits from site to site, sharing browsing data with hackers, advertisers and others; spyware/ adware, embedded malware, usually in a browser, that captures data and/ or redirects searches to unwanted sites; keyloggers, malware used to take screenshots of computer content or to harvest keystrokes; and malicious ads that direct users to unsafe sites. There are other behaviours, but those are the main ones in 2024.

Android & iPhone Browsers

Firefox Focus is an open source browser from Mozilla, available for Android and iOS smartphones and tablets. It improves browsing speed and protects users’ privacy by blocking online trackers, including third-party advertising. It checks all URLs against Google’s Safe Browsing service to help people avoid fraudulent sites. Firefox Klar is a modified flavour of Firefox Focus, with telemetry disabled. It was initially released for German-speaking countries. For those with serious privacy concerns this is the only flavor to use.

One of the challenges with Brave is that the features differ, depending on the type of device used. For example, in 2021 iOS users – but not Android users – got access to Brave Playlist, which allows them to save media such as music and videos to playlists. So, instead of bookmarking a song or video in Brave, users can view/ hear content offline (for a limited period, of unknown length to me). Brave saves content in a format that is only compatible with the Brave media player and can only appear on the device it was saved on. It cannot be downloaded to other devices.

Chromebook Browsers

Many Chromebook users will also want to consider Firefox Focus, running as an Android app. While Crostini Chromebooks allow the use of Linux apps, these features are still being developed, and there are challenges. On most systems, the Linux sound system doesn’t work, either in media players or in the Linux version of Firefox ESR. Some programs, like LibreOffice, work well; others don’t work at all. In general, Linux programs can’t recognize USB devices, which can be a problem if content has to be stored. However, there are workarounds for this problem.

Norwegian Health Platform

The above green monochrome photograph is attractive. So far, that’s the only positive comment I can make about this common patient record system for Middle-Norway, where I live. Photo: Helseplatformen.

For some months, I have been looking at a specific problem, and wondering how the political authorities thought they could find success buying an inappropriate off-the-shelf software solution, in this case a medical record database.

It is a software problem that is affecting everyone in Norway. The software in question is an overarching patient journal/ record system referred to as the Health Platform, intended to be used by all Norwegian hospitals, medical centres and everyone else in Norway involved in the care of patients. Its use will start in Trøndelag.

In the evaluation of any software product, it is important that at least some of the decision makers consult with people competent to evaluate it. For example, with a database, it is important to know what that primary key will be. I suspect that the person/ people deciding to buy this system, did not look at the primary keys, or anything else, from a computer science/ engineering perspective. I further suspect, that these decision makers may have been bean counters, who are used to dealing with legumes rather than sick human beans.

There have been situations reported in the press about this patient journal. Often it involves a situation where test results to confirm the presence of cancer have just disappeared, although sometimes they emerge months later. This is where I began to suspect inappropriate primary keys. One should be able to search for records having a specific primary key, a person’s identification number, and find everything about that person that is in the database.
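A hedged sketch of what that should look like, with invented table and column names: when every table carries the same person identifier, a single query (or join) returns everything the database holds about that person.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE patient (national_id TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE scan    (scan_id INTEGER PRIMARY KEY,
                          national_id TEXT REFERENCES patient(national_id),
                          description TEXT);
""")
con.execute("INSERT INTO patient VALUES ('01015012345', 'Example Patient')")
con.execute("INSERT INTO scan VALUES (1, '01015012345', 'CT head, 2024-11')")

# Everything about one person, found through the shared identifier.
rows = con.execute("""
    SELECT p.name, s.description
    FROM patient p JOIN scan s ON s.national_id = p.national_id
    WHERE p.national_id = ?
""", ("01015012345",)).fetchall()
print(rows)   # [('Example Patient', 'CT head, 2024-11')]
```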

We have one friend, in her late 60s, who had a medical emergency, probably an epileptic fit, and was sent to the local hospital. To help decide what her problem was and how best to treat it, the hospital conducted a CAT/ CT = computed tomography scan, a medical imaging technique used to obtain detailed internal images of the body. She was then released. That was about a month ago, and nobody knows where the results of the scan are located. Her meeting with her physician, to determine the most appropriate medication, has been repeatedly postponed, in the hope that the scans will emerge.

In contrast, there is a tradition of smallish organizations developing software that they need. For example, Østfold Hospital, in south-east Norway, has developed its own apps to streamline work processes, which they estimate saves healthcare personnel an hour per day per employee. The apps move work processes from PCs and paper to hand-held devices = smartphones, which increases safety by reducing human error. The hospital has its own app center that has developed 17 apps from scratch, including an app for drug control and one for blood transfusion. The hospital has received inquiries from other health authorities that want to use the apps, and has already signed an agreement with one for a particular app. Employees at the hospital believe that simpler work processes with new apps can help stem the predicted health crisis.

In the late 1970s a group was working in Bergen to develop the DOC 110 system, one of the very first computer systems for electronic patient record keeping (EPR). This was soon supplemented by the PROMED system. In 1979, another EPR system, Infodoc, came onto the market in the Bergen area, and this quickly gained popularity, especially in Western Norway. Also at the Institute of General Practice in Bergen, a single-user version of an EPR system was developed and tested in two medical practices on a Tandberg platform.

In the spring of 1980, what even today can be considered a reasonably complete electronic patient record (EPR) system was put into use by a group of doctors in Balsfjord, a 1 500 km2 municipality, with a population approaching 5 600 in Northern Norway. The system, which was developed by a publicly funded project at the University of Tromsø in close collaboration with doctors from Balsfjord, was then transferred to a public company, Kommunedata Nord-Norge, for commercialization and further development.

In the early 1980s, another general practitioner started development work alongside his general practice, which led to the launch of the EPR system Profdoc in 1985, a system that quickly gained popularity. Thus, the period before 1988 can be called the pioneering period for the development of EPR systems in general practice in Norway.

While Norway had made a good start, there were far too many competing incompatible systems. It was almost as if each county, and sometimes smaller parts of counties wanted to have their own.

Use and distribution

This situation continued in the decades that followed. Fast forward to 2012. The Health Platform is a joint electronic patient record for municipalities, hospitals, GPs and contracted specialists in Central Norway. Work on the Health Platform began in 2012, and on 2019-04-01 the implementation project was formally launched and at the same time the limited liability company Helseplattformen AS was established. The project is part of the state’s follow-up to the white paper One citizen – one record, which was presented in 2012. Except, I am not sure that there is a primary key linked to every resident.

In 2022-05, the Health Platform was introduced by Trondheim Municipality in all service areas within health and welfare, and on 2022-11-12 it was introduced for use at St. Olav’s Hospital. As of 2023-12, the platform is in use in ten municipalities in Central Norway, and 16 municipalities have signed a financially binding service agreement.

The acquisition cost of the system was originally estimated at NOK 1.7 billion with an expected lifespan of 18 years. The total costs for the development and implementation of the Health Platform for the health authority, municipalities and GPs until 2024 are estimated at approximately NOK 6.7 billion.

Technology

The health platform’s technical solution is provided by Epic Systems, an American information technology company in health informatics, which supplies patient records and patient administration to, among others, the Netherlands, Australia, the United Kingdom, Denmark and Finland. The contract between Epic and Helseplattformen AS was signed on 2019-03-19. The login and access management for the platform is provided by IBM and the agreement was signed on 2019-06-03.

Epic was founded in 1979 by Judith R. Faulkner (1943 – ) with a $70,000 investment. Originally headquartered in Madison, Wisconsin, it moved its headquarters to Verona, Wisconsin in 2005, where it employs 13 000 people. The company also has offices in Bristol, UK; ‘s-Hertogenbosch, Netherlands; Dubai, United Arab Emirates; Dhahran, Saudi Arabia; Helsinki, Finland; Melbourne, Australia; Singapore; Trondheim, Norway; and Søborg, Denmark.

I imagine that development is taking place at two distinct levels. Changes are being made in Wisconsin, and changes are being made in Norway, and other places. There is probably little coordination between these efforts.

Criticism and scandals

Robert Kuttner (1943 – ) writes in The American Prospect that Epic’s market dominance is driven by its software’s ability to maximize profits for hospitals by facilitating upcoding, a form of healthcare fraud. The Department of Health and Human Services found that from 2014 to 2019, the number of inpatient stays billed at the highest severity level increased almost 20%, while stays billed at lower severity levels decreased. Kuttner argues that this drive for profit maximization leads to providers spending two hours entering data for every hour they spend with patients. Kuttner also reports that providers are faced with time-consuming training, alert fatigue, and mistakes stemming from copying and pasting from previous notes, ultimately leading to burnout and early retirements.

David Blumenthal, the National Coordinator for Health Information Technology from 2009 to 2011, said: The customers [of EHRs] were the chief information officers and the chief executives of hospitals, not doctors. Their principal goal was to protect revenues. Systems like Epic were not designed to improve quality because there was no financial incentive to do so at the time.

Since the system was also used in Denmark and Finland, it might be interesting to look at their experiences. These are taken from the English Wikipedia.

Danish experience

In 2016, Danish health authorities spent DKK 2.8 billion on the implementation of Epic in 18 hospitals in a region with 2.8 million residents. On May 20, Epic went live in the first hospital. Doctors and nurses reported chaos in the hospital and complained of a lack of preparation and training.

Since some elements of the Epic system were not properly translated from English to Danish, physicians resorted to Google Translate. As one example, when inputting information about a patient’s condition, physicians were given the option to report between the left and the correct leg, not the left and right legs. As of 2019, Epic had still not been fully integrated with Denmark’s national medical record system. Danish anesthesiologist and computer architect Gert Galster worked to adapt the system. According to Galster, these Epic systems were designed specifically to fit the U.S. health care system, and could not be disentangled for use in Denmark.

An audit of the implementation that voiced concerns was published in 2018-06. At the end of 2018, 62% of physicians expressed they were not satisfied with the system and 71 physicians signed a petition calling for its removal.

Finnish experience

In 2012, the Hospital District of Helsinki and Uusimaa (HUS) decided to replace several smaller health record systems with one district-wide system created by Epic. It was called Apotti and would be used by healthcare and social services for the 2.2 million residents in the HUS area. The Apotti system was selected as the provider in 2015 and implementation started in 2018. By 2022-11, the Apotti system had cost €625 million.

After the implementation, complaints from healthcare workers, especially from doctors, started accumulating. The system was accused of being too complicated and that its convoluted UI was endangering patient safety. For example, one patient was administered the wrong chemotherapeutic drug due to an unclear selection menu in the system.

In 2022-07, a formal complaint demanding that the issues in the system be fixed or the system be removed entirely was sent to the Finnish health care supervising body Valvira. The complaint was signed by 619 doctors, the majority of whom were employees of the Hospital District of Helsinki and Uusimaa (HUS) and users of Apotti.

Norwegian experience

Central Norway started introducing Epic (branded “Helseplattformen”) in 2022-11. After approximately two months, the public broadcaster NRK reported that around 25% of the doctors at the region’s main hospital considered quitting their job, and that 40% were experiencing stress-related health issues due to the new IT system. Previously, health personnel actively demonstrated against the software by marching through the city of Trondheim. Due to the chaos following the introduction, including 16 000 letters not being sent to patients, the Norwegian CEO of the Helseplattformen IT project, Torbjørg Vanvik, had her employment ended by the board. Unexpected cost increases forced the authorities to decrease efforts in other areas, such as a planned initiative on mental health. Employee representatives state that the public will receive significantly worse services. A year after implementation, over 90% of doctors in the affected hospitals considered the Epic system a threat to patient health, and hospital staff organised large protests at seven hospitals that had implemented or were planning to implement Epic systems.

In 2023 and 2024, the Norwegian Auditor General presented several reports on the Health Platform, describing the planning and implementation as highly objectionable, the Auditor General’s most serious level of criticism. The reports stated that the Health Platform did not function as it should, that its use could compromise patient safety, and that doctors and nurses should spend their time treating patients, not on double registration and double reporting in cumbersome IT systems.

Deaths related to the Health Platform

On 2022-12-16, a Danish-Norwegian architect died at St. Olav’s Hospital due to a stroke. Two days earlier, she had been discharged from the hospital without any information about which doctor and department were actually responsible for her at the time of discharge. There was early speculation about whether the Health Platform had a role in her death, but after the hospital’s report, the county physician stated: “The Health Platform has no information about the course, so there is no doubt that there is a weakness in the Health Platform that created an unclear course. When the case first happened, it revealed significant challenges with the Health Platform, but as I read it, I cannot say that it played a role in how the case ended. The Health Platform has in any case made it difficult to rule out a connection.”

Illegal use of consultants

In 2022, the Trondheim-based newspaper Adresseavisen revealed that the consulting company Ernst & Young had been paid NOK 557 million through a framework agreement that was estimated at NOK 70 million. This turned out to be illegal. It was also revealed that consultants and co-owners of the consulting company had held central roles in the management of the Health Platform for several years. The newspaper believed that the violation could have been caught in 2019, when the CEO and several other employees received a legal memo by email that dealt with framework agreements like this one. The CEO of Helseplattformen AS explained that she had not opened the attachment in the email.

Torchlight procession

On several occasions, torchlight processions have been organized in protest against the introduction of the Health Platform. The first was on 2022-10-17. The purpose of that demonstration was to prevent the platform from being taken into use at St. Olav’s Hospital in Trondheim. On 2023-11-13, one year and one day after the medical record system was introduced at St. Olav’s Hospital, new torchlight processions were organized in hospital towns: Namsos, Levanger, Trondheim, Molde, Ålesund, Kristiansund and Volda.

On 2024-05-29, the Norwegian Medical Association adopted a resolution at its national board meeting requesting that the controversial Health Platform be investigated. It pointed to several inspection reports stating that the Health Platform creates challenges for patient safety and the working environment, that it creates additional work for health personnel, and that it entails large additional costs for the health service in Central Norway. The association wrote in a press release that it is seriously concerned about the consequences of continuing to use the service, and that its termination must therefore be investigated.

On 2024-10-24, the Norwegian Auditor General’s report on the introduction of the Health Platform at St. Olav’s Hospital was published. It concluded that the planning, organization and introduction of the new medical record solution in Central Norway deserved serious criticism. The system is described by some as complete, but demanding. This is in line with the findings of an external evaluation report from 2023. So, I continually wonder, what sort of idiots decided they wanted to use this?

In a pompous rebuttal to the Auditor General’s report, the director of the Health Platform stated that it could not be replaced, only improved. The rebuttal began with the only photo in the document: that of the director! I disagree with his assertion.

As Østfold has shown, software does not have to be large; it just has to be appropriate. However, it helps if one uses small groups of professional staff to ensure that complex software is appropriately designed.

Conclusions

The political decision to implement the Health Platform has not been transparent. I do not understand why earlier, fully functional, Norwegian-developed solutions have been abandoned for an inappropriate, American system.

Peripherals: An Update

A worker sitting at a desk, using an ERGO K860 keyboard with a Vertical mouse. The monitor looks large enough for older users, who would probably want something with a diagonal size of 27″ = 686 mm, or more. To the left are on-the-ear earphones. In my opinion they are not for older users with any hearing issues. They are not particularly good, because: 1) they spread noise throughout the working environment, degrading that environment for others; 2) they are not particularly effective at preventing extraneous sounds from disturbing the worker. Personally, I use a gaming over-the-ear headset to listen and speak. I also have a Logitech camera, such as the one shown mounted on the monitor. While it can be used, it isn’t, so it has been removed, but kept. It can be fitted when needed. I agree that there is no need for a physical/ cabled connection of the keyboard or mouse with the computer. In my world these peripherals are connected using a Bolt receiver, which usually comes supplied with the peripheral. However, sometimes it doesn’t, so check before purchasing. Photo: Logitech

When personal computers first emerged, they came with all the peripheral equipment that allowed them to run. This was absolutely necessary, because the computing unit and the periphery had to work together. To begin with, home computers had only a monitor/ screen, a keyboard, and possibly a cassette player for data storage. In many cases, rodents were not used. However, our original Amiga computers came with one. Inside the computer there was a 20 MB hard drive.

To connect a peripheral to a computer, one has to be aware of two types of standards: signal and connector. Both have to match. Amiga video was the signal standard. It was similar to SCART, but different. It included a digital RGBI signal, Genlock clock, composite sync and +12/+5 VDC power. The connector standard was DB23, a unique D-Sub connector variant with only 23 pins for the video cables. Early Apple Macintosh computers used a similar connector, but with 25 pins. Amiga enthusiasts have been able to have this connector remanufactured, but such happy endings are rare.

Two words describe the situation with early home computers: similar, but different. In time, ports became standardized. There were a lot of different legacy ports, but these have gradually been replaced, especially with USB-A and USB-C ports, along with ports for monitors (most often HDMI) and Ethernet (RJ-45).

Computer manufacturers had to ensure that their products met both the signal and the connector specifications of the ports, allowing peripherals to become generic products that could be purchased separately. The companies that made these products became brands. If consumers were satisfied with one product from a brand, they would often choose other products made by it. If they were dissatisfied, they would find some other brand. Sometimes, people were satisfied with a brand, but that brand just disappeared, requiring people to find new brands and products. I cannot remember the process in detail, but over the years, I became satisfied with Logitech peripherals, and less satisfied with those of other brands, in part because they disappeared. Now, most of the peripherals (apart from monitors and printers) we use, and that are mentioned here, are made by Logitech. Monitors offer the greatest variation; they were made by a variety of brands. We have one monitor each from: Acer, Asus, AOC, Benq and Samsung. The most common year of manufacture for them is 2012. The oldest is from 2007, and still works. Most of our printers, with two exceptions over a period of forty years, were made by Canon.

In addition to peripherals from Logitech, we also have some inexpensive peripherals, often from Trust. Trust is a computer peripheral electronics company founded in 1981 in the Netherlands by Michel Perridon (1963 – ), under the name Aashima Technology B.V., to import computer accessories, game consoles and video games. From 1985, it started producing its own Trust-branded products. It has been owned since 2018 by Egeria, a Dutch investment company. I appreciate Trust‘s low prices.

Note: If a model designation appears in this text without a brand name, that name is Logitech.

While younger workers, with better sight and hearing, can afford to be more fashion conscious in their choice of peripherals, older users should probably concentrate on functionality. They should choose peripherals on their ability to help the user do the needed work: especially, hearing what people are saying, and seeing what people have written. Their keyboards must allow them to reply quickly and accurately, and their rodents must help them navigate. If living spaces are shared with others, it is particularly important that sounds are not spread. What is incoming information for one person can be regarded as distracting noise by another.

Looking back at peripherals, one of the main differences over the past years is that legacy ports and connectors are increasingly being replaced by legacy-free variants. Thus, when considering the purchase of a new computing device, purchasers must be aware of how these devices will connect. There are physical adapters that allow a peripheral to use a port it was not designed to connect to. As more computers add USB-C ports (and remove USB-A ports), USB-A female to USB-C male adapters will become increasingly important. In addition, some peripherals communicate through Bluetooth. These protocols are backward compatible. Many Logitech devices connect through a Bolt receiver. These (or at least mine) assume that a USB-A port is being used. Some people (but not me) are adapting them, so that they will fit a USB-C port. I still have enough USB-A ports on my computers, but not on my hand-held device. It is only equipped with a USB-C port, and a 3.5 mm audio jack.

Keyboards

There is one main reason to buy an ergonomic keyboard: health issues with one’s hands. Despite a diagnosis for osteoarthritis in several joints in both hands, I didn’t buy ergonomic equipment immediately. There were two reasons for this: 1) I was very happy with my K380 keyboard; 2) I wanted to ignore the health issues. Most days these issues were not serious, until they were.

Most ergonomic keyboards are expensive. Despite this, in 2021, I transitioned from a conventional M535 mouse and a stylish K380 keyboard to a more ergonomic MX Vertical mouse at NOK 1 200, and an ERGO K860 keyboard at NOK 1 370. These are connected to the computer using a Logi Bolt receiver.

I am happier with the mouse than with the keyboard. After more than three years of using the ERGO keyboard, I still regard the K380 as my favourite. Even after transitioning to an ergonomic keyboard, I purchased a second K380 keyboard, so that if I should transition back, I would have a lifetime supply. In addition, I purchased a similar MX Keys Mini keyboard in pink, with a matching MX Anywhere 3 mouse. The keyboard was bought used, at half price, from someone moving from Norway to Belgium, who needed an American keyboard for programming.

Trish does not have the same issues with her hands. She also has an MX Keys Mini keyboard, but in graphite/ gray, with a blue M177 mouse. This is her daily drive with her desktop machine. I have purchased her a Lift mouse, which is smaller than my MX Vertical mouse, for her to use when her current mouse wears out.

We also have two other keyboards for use with our equipment. There is a K400+ keyboard that is specifically designed for use with televisions (which is where it is used), and a K480 keyboard, which is similar to the K380 keyboards in its design, and in allowing Bluetooth connectivity. This is a dedicated keyboard for use with our Zenfone hand-held devices, or other hand-held devices, including tablets. This is useful because smartphone keyboards are excessively small and awkward to use.

All of these are ISO keyboards with Nordic features, allowing us to write in English or Norwegian without difficulty. ISO keyboards are often used in Europe to support various languages, and for many Europeans they work much better than an ANSI keyboard, which is preferred by English-language users.

When I attempted to find out what assorted keyboards were preferred in Asia, the general advice was to use a US QWERTY keyboard. Following this up, for Chinese I discovered that Chu Bong-Foo (1937 – ) invented the Cangjie input method in 1976, which assigns different “roots” to each key on a standard computer keyboard.

For other languages, including Hindi, Japanese and Korean, there are other input method editors that can produce appropriate content. Again, many of these rely on a standard US QWERTY keyboard.

An ISO keyboard has been part of my life since the acquisition of our first personal computer, an Amiga 2000, in 1986. That keyboard was Norwegian specific. The Amiga 1000 that we borrowed before our own computer arrived was probably not ISO. At some point, Norwegian language keyboards ceased to be easily available, and were replaced by Nordic keyboards.

Danish, Finnish, Norwegian and Swedish each have 3 additional letters, 29 in total: Å is in the same place in all four languages, located to the right of P on the keyboard. Swedish and Finnish use the same layout, with Ö and Ä following consecutively to the right of L, as do the Danish Æ and Ø, and the Norwegian Ø and Æ. Yes, Danish and Norwegian have two of these extra letters in the reverse order. Icelandic uses the same physical keyboard format, but has 32 letters, and a much more complex arrangement.

During the operating system installation process, users are typically polled about the keyboard layout to be used. The language to be used is a separate question. If necessary, this information can be changed later.

A comparison of ANSI and ISO physical keyboard layouts.

There are five differences between the keyboards, as shown in the table below.

Trait | ANSI | ISO
Enter/ Return key | Short and wide | Tall and narrow
Left Shift key | Half the width of the right Shift key | Same width as the right Shift key
Backslash key | Above the Enter key | Left of the Enter key
Right of Space bar | Right Alt key | Alt Graph (AltGr) key
Number of keys | 104 (full)/ 87 (compact) | 105 (full)/ 88 (compact)

People experience varying degrees of difficulty transitioning between ANSI and ISO keyboards. One of the laptops I disposed of (prematurely?) had an ANSI keyboard, which I found problematic to use.

Logitech offers ANSI keyboards for American users, and ISO keyboards for British and other European language users. While I find the ERGO K860 comfortable to type with, there are many other manufacturers of ergonomic keyboards. Note: the photo on keyboard packaging may be deceptive in terms of ISO/ ANSI; check the keyboard itself before purchasing.

By accident and to my surprise, I discovered that many people wash their keyboards in sinks, or even dishwashers. Precautionary suggestions vary from none to ensuring that cords/ cables are covered in plastic, that cool water be used, that no or a mild detergent/ soap be used, that a dishwasher’s top rack be used, that the keyboard be given a week to dry thoroughly. None of the above advice is mine, and it comes with no guarantees! Yes, I have on occasion used a damp cloth to remove dust (and other contaminants) from the surface of my keyboards.

Membrane Keyboards

I use the adjective sedate to describe keyboards that use membrane switches. Membrane keyboards have a life expectancy of about 5 million key presses. If a person types minimally, say 1 000 words a day = 5 000 characters, such a keyboard will last almost three years. In contrast, a mechanical switch can last 50 (Gateron) – 100 (Cherry) million key presses, which at the same production rate means they should last over 27 or 54 years, respectively.
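
For readers who like to check the arithmetic, here is a minimal Python sketch of the calculation above. The switch ratings and the daily rate of 5 000 keystrokes are the figures quoted in the text; real usage will obviously vary.

```python
def lifespan_years(rated_presses: int, presses_per_day: int) -> float:
    """Expected keyboard life in years at a constant daily typing rate."""
    return rated_presses / presses_per_day / 365

daily = 5_000  # roughly 1 000 words a day, as in the text
for name, rating in [("membrane", 5_000_000),
                     ("Gateron mechanical", 50_000_000),
                     ("Cherry mechanical", 100_000_000)]:
    print(f"{name}: {lifespan_years(rating, daily):.1f} years")

# membrane: 2.7 years
# Gateron mechanical: 27.4 years
# Cherry mechanical: 54.8 years
```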

So here we are at the end of 2024, and my K860 membrane keyboard still works flawlessly, despite having logged at least those 5 million key presses. While I know that many products work until they don’t, I have difficulty believing that I have reached anywhere near its end of life. Halfway, at most! So, I expect it to keep on working until 2028, when I celebrate my 80th birthday. Then, well, I might treat myself to something else. I might just revert to using my favourite keyboard, a K380.

I suspect that most readers of this weblog post do not have such excessive production rates as I do. Yet, they still face two contradictory impulses with respect to their keyboards, even if they work satisfactorily. The first is tedium/ boredom. How many years does a person want to interact with the same keyboard? The second is the reverse, novelty. How many keyboards does a person want to experience? For me, I want the keyboard to respond to my touch in a specific way. I touch the keys lightly, and know precisely how far I have to press down to activate each key. I want the keys to respond silently. I have been touch-typing for over sixty years.

Mechanical Keyboards

In describing the world of mechanical keyboards, I never know which adjective to use: frenzied, obsessive or hyper. This section is possibly longer than necessary for many older users, who are content with membrane keyboards. However, some younger people, who are still out in the working world, may prefer to gain some insights about them.

In offices, production rates allegedly vary from 8 000 – 22 000 keystrokes per hour for data entry tasks, to about 2 000 keystrokes per hour for more general office work. With an effective production of 20 000 keystrokes per hour over a six-hour day, this results in 120 000 keystrokes per day, or perhaps about 25 million keystrokes per year, which would give a life expectancy of between two and four years for a mechanical keyboard. A membrane keyboard would not be acceptable in such an environment!
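
The office scenario can be checked the same way. The 210 working days per year is my own assumption, chosen only to land near the estimate of about 25 million keystrokes per year:

```python
keystrokes_per_hour = 20_000
hours_per_day = 6
working_days_per_year = 210   # assumption: roughly 42 five-day weeks

per_day = keystrokes_per_hour * hours_per_day     # 120 000
per_year = per_day * working_days_per_year        # 25 200 000

for name, rating in [("Gateron, 50 million", 50_000_000),
                     ("Cherry, 100 million", 100_000_000)]:
    print(f"{name}: {rating / per_year:.1f} years")

# Gateron, 50 million: 2.0 years
# Cherry, 100 million: 4.0 years
```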

The active ingredients of a mechanical keyboard are its switches, one for each key. There are three standard types: linear, tactile and clicky. Linear switches are simple; they lack tactile/ audio feedback when they reach the actuation point, where the key press is registered. Tactile switches provide tactile feedback, commonly referred to as a bump, when hitting the actuation point. Clicky switches provide tactile and audio feedback when they hit the actuation point. The feedback provided by both tactile and clicky switches reduces typing effort. One is continuously aware of how much effort is needed to register a key press, so a user can type faster. People who do not want to disturb others with audio effects will choose linear switches or silent tactile switches; tactile switches are typically referred to as brown switches.

There are experiences that lead me towards, or away from, particular manufacturers and particular products.

Switches are available from Logitech, Kailh, Cherry, Gateron and Epomaker. There are a lot of potential mechanical keyboard contenders. Some of the ones I have considered acquiring are: Logitech POP Keys with cloned Cherry MX Brown switches from Trantek Electronics Co. Ltd of China; Drop ENTR with switches from Kailh; Keychron Q8 (Alice) ergonomic keyboard with Gateron Pro G Brown switches; and Epomaker TH80 a slightly less ergonomic keyboard with their own Flamingo linear switches.

With so many Logitech keyboards in my collection, I was initially attracted by the appearance (blast yellow & emoji keys) of the POP equipment, but not by the keyboard or mouse functions. Unfortunately, there are numerous issues with the keyboard. One review indicates that the keyboard is tricky and unforgiving to type on, and that replacing useful key functions with five emoji shortcuts is just a novelty. The keycaps don’t seem particularly well made, with pad-printed legends, a method criticized for poor wear resistance. Dye sublimation or double-shot molding is preferred. There are no height-adjustable feet. It does not come with backlighting. It is not an ISO keyboard. The keyboard costs about US$ 100, plus shipping. One word describes the keyboard: gimmick!

A Logitech POP mechanical keyboard, complete with emoji keys, and matching POP mouse in blast yellow. The keyboard may look cute or even attractive, but it is inferior to most other mechanical keyboards. Photo: Logitech.

Drop was founded in 2012 in San Francisco as Massdrop by Steve El-Hage and Nelson Wu, who met in Toronto. It changed its name to Drop in 2019. For me, the problem with Drop was the initial propaganda I encountered. It showed a tube-based amplifier beside a keyboard. This was a danger signal, as I am not a member of any tribe using old-fashioned tubes. Yes, I belong to the transistor generation!

If one can overcome that initial prejudice, the ENTR keyboard is a Ten Key Less (TKL) = 80% board, which lacks some keys, some say 17 rather than 10, relative to a full-sized keyboard. The missing keys are not used much, and there are workarounds for them. Ergonomically, a TKL is considered much better for productive typing than a full-sized keyboard. The challenge for me is that it is an ANSI keyboard, which makes it, by definition, unsuitable. It also costs US$ 100, plus shipping.

A Drop ENTR mechanical ANSI keyboard. Technically better than a Logitech POP. Photo: Drop.

The Keychron Q8 Alice ISO-Nordic keyboard is a good mechanical ergonomic keyboard. Many describe it as a 65% keyboard, because its keys are based on a 65% layout that is split into right and left halves, with a gap between them. The two sides are angled and tilted upwards. Other ergonomic keyboards tilt downwards. At a price over US$ 200, plus shipping, it is an expensive keyboard. Yet, given its durability, it is probably a suitable investment for anyone who writes a lot and expects to live more than six years longer, the life expectancy of two Logitech K860 keyboards at the rate of 5 000 keystrokes a day. One advantage of this keyboard is its mass. The keyboard is constructed of CNC-machined 6063 aluminum, with a mass of 1 820 g. It stays firmly in place when typing, while lighter keyboards have a tendency to move. Its Quantum Mechanical Keyboard (QMK) firmware can be customized/ programmed/ mapped with the VIA configurator/ programmer/ app. Both the firmware and the configurator are open-source.

A Keychron Q8, with an Alice ISO Nordic layout, additional/ substitute keys available, and details of the Gateron red switches, commonly supplied. Photo: Keychron.

The Epomaker TH80 offers a keyboard that is slightly less ergonomic, but more affordable at US$ 100 plus shipping. It offers more keys than the Q8. Some regard it as a 75% keyboard, others as an 80% keyboard. The keys are hot-swappable, which is probably uninteresting to anyone over the age of 30 not engaged in gaming. There are three ways to connect this keyboard to a computer: 2.4 GHz wireless, Bluetooth 5.0, or a wired USB-C cable. Keycaps are made from polybutylene terephthalate (PBT), which tolerates high temperatures (150°C), resists solvents, is mechanically strong, and wears well with a matte finish. On the negative side, PBT keycaps are usually more brittle, and resonate more/ sound louder when typing. An MDA profile indicates a keycap design that emphasizes a uniform concaveness. The individual keycaps have a wide and flat surface. They are more suitable for typing than gaming. These keycaps have dye-sublimated legends. Another feature of the keyboard is its RGB effects, using south-facing (towards the typist) software-programmable LEDs.

An Epomaker TH80 ISO keyboard, but with an American, rather than Nordic, layout.

Rodents

An MX Vertical mouse is my companion rodent. Propaganda from Logitech told me it works at the ideal angle for a hand = 57°. I find it comfortable to use. However, not everyone may agree, particularly people who prefer to use their left hand with a mouse, or people with smaller hands.

Right-handed versions of the Logitech Lift are available in three colours: rose, off-white and graphite. A left-handed version is available in graphite only.

The Logitech Lift, and a minimally different Lift for Business, are suitable for people with small to medium sized hands. Some of the differences between the Lift and the Vertical include: the Lift is made out of plastic, while the Vertical is made from rubber and aluminum; the Lift can only be used wirelessly, while the Vertical can also be used with a USB-C cable; the Lift has a replaceable battery, while the Vertical is rechargeable; and the Lift is good for 3 million clicks, the Vertical for 10 million.

Variants of both the Vertical and the Lift are made for people who favour their left hand, although colours may be limited. Further information about mice for people with large hands can be found here, which includes a reference to a Levkey left-hand mouse. Similar information for people with small hands can be found here.

My experience with rodents is that they seldom fail, but when they do, they have reached the end of their life. Appropriate replacements take time to acquire. While we do have local stores that sell them, most of the mice available locally do not meet my specific requirements. For this reason, I always have an extra mouse on hand that I can use when required. My current reserve rodent is an MX Anywhere 3. It is often classified as a travel/ compact mouse. The two main attributes that are often highlighted are its speed and accuracy. Mine is pink, to match the MX Keys Mini keyboard, which was also acquired in case my primary keyboard should ever fail.

For computer-aided design, as well as other drawing inputs, I use a Wacom One pen tablet. It comes with a pressure-sensitive and lightweight pen, without batteries or a need to charge. It connects with a USB cable to any Windows, macOS, Chromebook or Linux PC. I have only used it with a Linux machine, but there are built-in drivers that just work. Propaganda from Wacom says that it is bundled with apps specifically tailored for education. I have no idea what they are talking about. However, it did work with Krita, the graphics program I was using (its name comes from the Swedish word for chalk/ crayon). Size (W x D x H): 277 x 189 x 8.7 mm, without tag and rubber foot, with an active area (W x D) of 216 x 135 mm.

There are more expensive products, that may be suitable for graphics and some other professionals. For non-professionals, the Wacom One is gudenuf!

Flatscreens

In 1969, I worked as a stockbroker trainee in Vancouver, where I had regular contact with a Mitsubishi lumber buyer. He described a flat-panel display he had seen in Japan, that was being developed by his company. He expected it to be on the market by the mid-1970s, at the latest. He was a bit optimistic, as it took about thirty years. In the US, Westinghouse had already developed an electroluminescent display (ELD) made using thin-film transistors (TFTs), but it was not ready for prime time.

The brightness of a monitor is measured in candela per square meter. Since this term is long and complex, the unofficial term nits is often used. Since the candela measures light intensity, the value in nits shows how bright a screen appears. Nit comes from the Latin verb nitere = to shine. This monitor (the AOC CU34P2A, discussed below) provides 300 nits, which is within the normal range of between 200 and 600 nits for laptop screens and monitors. 300 nits is considered to be most appropriate: it offers good visibility, shows colours well and prevents eye strain.

Almost all new monitors have LED backlighting. WLED just means white light emitting diode, which I consider a marketing gimmick. While gamers and video editors like to have the fastest possible refresh rate, this is not so important in office situations. A refresh rate of 100 Hz (and sometimes even 60 Hz) is fully acceptable. This monitor has a refresh rate of 144 Hz.

There are two different pixel response benchmarks for monitors: Grey-to-Grey (GtG) and Moving Picture Response Time (MPRT). GtG represents how long it takes for a pixel to change between two colours, while MPRT represents how long a pixel is continuously visible. While GtG pixel response time has improved, MPRT has not become faster, because MPRT is limited by the refresh cycle and by the frame time. Here, the response times are: GtG = 4 ms, while MPRT = 1 ms.
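
Because MPRT is tied to the refresh cycle, a back-of-the-envelope calculation of frame time at common refresh rates shows why a 1 ms MPRT figure normally implies that each frame is displayed for only part of the cycle (for example, through backlight strobing), rather than a faster panel alone. A minimal sketch:

```python
# Frame time (one full refresh cycle) at common refresh rates, in milliseconds.
# A pixel visible for the whole cycle cannot have an MPRT much lower than this.
for hz in (60, 100, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")

# 60 Hz -> 16.7 ms, 100 Hz -> 10.0 ms, 144 Hz -> 6.9 ms
```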

Contrast ratio refers to the difference between the minimum brightness and maximum brightness of a monitor. For example, on a monitor with a 1 000:1 contrast ratio, a white image would appear 1 000 times brighter than a black image.

There are no official test procedures for finding a contrast ratio. Static contrast ratio attempts to measure the difference in luminosity between the brightest and darkest shades the system is capable of producing simultaneously. Dynamic contrast is the luminosity ratio between the brightest and darkest shades the system is capable of producing over time, while a picture is moving. Here, the contrast ratios are: 3 000:1 static; 80M:1 dynamic.
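
As an illustration, a static contrast ratio is simply the brightest luminance divided by the darkest luminance the panel can show at the same time. The values below are hypothetical, chosen only to reproduce the 3 000:1 figure quoted here:

```python
# Hypothetical luminance values (in cd/m2, i.e. nits), used only to
# illustrate how a static contrast ratio is calculated.
white_luminance = 300.0   # brightest shade shown on screen
black_luminance = 0.1     # darkest shade shown at the same time

print(f"{white_luminance / black_luminance:.0f}:1")   # 3000:1
```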

Almost all monitors specify a viewing angle of 178 degrees.

Manufacturers are always enthusiastic about how many colours their monitors can reproduce. 16.7 million is a typical answer. This means that 24 bits are used to define the colour of each and every pixel, eight bits each for red (R), green (G) and blue (B). If there are slightly over one billion colours, this means that 30 bits per pixel are being used: 10 bits each for R, G and B. In the real world, this is only used in equipment for video and photo editing.
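
These colour counts fall directly out of the bit depth: the number of values per channel, raised to the power of the three channels. A minimal sketch:

```python
def colours(bits_per_channel: int) -> int:
    """Distinct colours available with the given bit depth per R, G and B channel."""
    return (2 ** bits_per_channel) ** 3

print(f"{colours(8):,}")    # 16,777,216    -> marketed as 16.7 million
print(f"{colours(10):,}")   # 1,073,741,824 -> marketed as just over one billion
```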

Vertical synchronization (V-Sync) is display technology software designed to help monitors prevent screen tearing, a situation that arises when two different image components interact because the monitor’s refresh rate can’t keep pace with the data being sent from the graphics card. It causes a cut or misalignment to appear in the image.

In 2013, Nvidia developed G-Sync, a proprietary hardware solution to the same problem. G-Sync works only with Nvidia graphics cards. In 2014, AMD developed FreeSync, a royalty-free alternative to G-Sync. It works with AMD as well as Nvidia graphics cards, and on some consoles. Thus, it is a more flexible solution. The one concession to gaming on this monitor is its use of FreeSync, which works between 30 and 144 Hz.

Overexposure to blue light is claimed to cause problems that range from dry eyes to eye strain, sleep cycle disruption and macular degeneration, which can cause partial blindness. Some people blame these problems on the overuse of computers, rather than blue light. Regardless of the merits, many manufacturers are reducing the amount of blue light being sent out.

If the display cannot be positioned satisfactorily, then one must consider repositioning the desk. Ideally, a display should be placed at right angles to, or away from, windows and other light sources so it does not create/ reflect glare. Glare may cause eye strain. Once this is done, the display should be centered directly in front of the user.

When the wait for flatscreens was finally rewarded, Cliff Cottage was eventually populated with them, but it took time, because they were so expensive to start with. The life of a flat-screen monitor probably exceeds fifteen years. Thus, people may want to ensure that what they purchase is suitable for their future needs.

One notable product was an LG 24″/ 60 cm flatscreen monitor for a media centre, purchased in the early 2000s. It lasted until it was replaced with a Samsung 40″/ 100 cm model from 2010, purchased used in 2012. This machine was in use until a 50″/ 125 cm model from NetOnNet, an Anderson QLED5031UHDA, was purchased 2023-11-17. It has 4K = 3840 x 2160 pixel resolution. In terms of sound, it is equipped with the Dolby Atmos and DTS:X surround sound standards.

There are three common types of LCD panels: twisted nematic (TN), in-plane switching (IPS) and vertical alignment (VA). This monitor uses a VA panel. Compared to a TN panel, it has a deeper-black background, a higher contrast ratio, a wider viewing angle, and better image quality at extreme temperatures. Compared to an IPS panel, it has deeper black levels that allow for a higher contrast ratio, here 3 000:1. However, the viewing angle is narrower, and colour and especially contrast shifts are more apparent.

Office monitors have changed considerably over the past years. With a conventional office desk, an ideal screen size used to be 24″/ 61 cm, then 27″/ 69 cm or less. If more screen area was required, up to three such screens could be parked on a desk, so that multiple data windows could be opened simultaneously.

Numerous laptops and assorted flatscreen monitors have also been purchased. For example:

Trish uses a Benq GL2450 T, which was made 2015-04. My records do not indicate when it was purchased, but probably sometime in 2015. It has a 24″/ 61 cm screen, with high definition 1920 x 1080 pixel resolution, using TN technology. It offers 250 cd/m2 brightness, which is now regarded as a low value. Dynamic range = 1k:1 – 1M:1. Energy use = 50 W, 0.3 W in standby. The response time is 5 ms, which is adequate for office use, but too slow for some other uses. It also comes with a DVI-D connector, rather than an HDMI connector. This is manageable because there are adapters between DVI-D and HDMI. It functions adequately, but is not height adjustable and does not have many of the features found on modern monitors.

For many years I used a Samsung SyncMaster S27B350 monitor that was purchased 2012-11-09. It has a 27″/ 686 mm HDMI screen. Other differences between it and the Benq discussed above are: Brightness = 300 cd/m2. Energy consumption = 40 W. Its response time is 2 ms.

Because I have some vision issues related to blue-light exposure, it was replaced 2020-10-07 with an AOC Q27P2Q, a 27″ height-adjustable monitor, using IPS panel technology, displaying 2560 x 1440 pixels, with low blue light technology. Dimensions: 528.2 (H) x 808.4 (W) x 237.4 (D) mm; tilt: -5°/ 23°; height adjustment: 130 mm; swivel: -32°/ 32°. Power: 67/ 0.5/ 0.3 W on/ standby/ off, respectively. Mass: 8 100 g. The AOC screen has also wavered in price. It was purchased for NOK 2 300. Soon after my purchase it increased to about NOK 3 200, then it fell once again to NOK 2 400. As this post is published, it costs NOK 2 700.

When I purchased this new monitor, I was aware that it wouldn’t work with my KVM = keyboard & video & mouse switch, a device that allows several computers to share the same peripheral equipment. My KVM, an Aten CS692, has two ports, allowing it to work with two computers, but the screen size is restricted to HD = 1920 x 1080 pixels. Fortunately, if I am desperate to use two computers simultaneously, I have up to several old HD monitors available that work with this HD KVM.

Depending on one’s activities, only extraordinary circumstances dictate that a person should have more than one keyboard, monitor and mouse at their workstation. KVMs can be purchased (or in some circumstances made) that will meet the individual needs of any user. Many KVMs can be used to access computers remotely. While my desktop machine is adequate for most purposes, it would not be suitable for either gaming or video editing.

Today = 2024, people are going over to ultra-wides. This allows for a 35″/ 90 cm screen that fits on a conventional office desk. At some point, Trish’s monitor will either wear out, or she will realize that it has passed its use-by date. Thus, I could offer her my AOC monitor, and purchase an AOC CU34P2A 34″ ultrawide curved WQHD gaming monitor. This monitor was selected for discussion because of its price: it was under a magic NOK 4 000 = US$ 360 (at publication).

Diagonal size: 34″/ 864 mm, with a curvature equivalent to a radius of 1 500 mm, normally described as 1500R. There is a 797 x 334 mm viewing area, populated with 3440 x 1440 pixels. This gives an aspect ratio of 2.39:1, but it is marketed as 21:9. It consumes 55 W when in use, and 0.3 W on standby.
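
Using the figures above, the aspect ratio and an approximate pixel density are easy to check. The viewing-area width is the one quoted in the text, so the result is only approximate:

```python
width_px, height_px = 3440, 1440   # pixel count quoted above
width_mm = 797                     # viewing-area width quoted above

aspect = width_px / height_px
ppi = width_px / (width_mm / 25.4)   # horizontal pixels per inch

print(f"aspect ratio: {aspect:.2f}:1")    # about 2.39:1
print(f"pixel density: {ppi:.0f} PPI")    # about 110 PPI
```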

When discussing a monitor for office purposes, it is important to remember that the standard paper document size in Europe is A4 = 210 x 297 mm = 8.3 x 11.7 inches. It is the counterpart of the 8 1/2″ x 11″ paper found in North America. An ultra-wide monitor allows for up to several rows of tool, menu and other bars at the top and bottom, as well as the display of four documents with their width only slightly compacted, but with their length at full size.
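
The claim about four slightly compacted A4 documents can be checked against the 797 mm viewing width quoted above. A quick sketch:

```python
panel_width_mm = 797   # viewing-area width quoted above
a4_width_mm = 210      # width of an A4 page in portrait orientation

print(f"{panel_width_mm / a4_width_mm:.2f} A4 widths fit side by side")        # about 3.80
print(f"four pages shrink to {panel_width_mm / (4 * a4_width_mm):.0%} width")  # about 95%
```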

In-Plane Switching (IPS) technology should be used to provide accurate colour and a wide (up to 178°) viewing angle. It should be easy to height-adjust, tilt, pivot and swivel the display. In addition, the display should be flicker free and have reduced blue light, especially after dark. Some find it advantageous for a display to have built-in stereo speakers. However, headsets should be provided and used whenever two or more people share/ occupy a room.

Headsets

Clarification: I am not, and have never regarded myself as, an audiophile. In fact, I am opposed to using the term, because those self-appointed lovers of sound far too often promote nonsense.

There are many different types of headphones, distinguished by their ear cups: closed-back, open-back, open-ear, on-ear, over-ear and in-ear (earbuds). Headphones are used for different purposes. A performing musician may want something entirely different from someone watching YouTube videos on a computer.

Closed-back headphones are designed with sealed ear cups that offer excellent noise isolation. They also prevent the user from hearing people and other things in their environment.

Open-back headphones have perforated ear cups that allow air and sound to pass through.

Open-ear headphones are designed to transmit sound to the inner ear, bypassing the eardrum, typically via bone conduction. Some users state that these make it possible to immerse oneself in content while remaining aware of one’s surroundings. They are useful for people who are active outdoors.

On-ear headphones rest on the ears rather than enclosing them. It is claimed that they strike a balance between portability and comfort, offering good sound quality without fully sealing off the ears from the environment. These may be suitable for younger users, but for people with a hearing disability, they can be detrimental.

Over-ear headphones envelop the ears completely, providing noise isolation and immersive sound. This is my preferred type of headphone.

In-ear earbuds are small, lightweight headphones that fit directly into the ear canal. They are highly portable and very compact, but due to the small size of the speaker, they can’t create true sound isolation.

In-ear headphones, also known as in-ear monitors (IEMs), fit snugly into the ear canal, providing excellent noise isolation and sound quality. These go deeper into the ear canal than earbuds.

When I was in the market for a headset, many sites with reviews about headsets for the hearing impaired suggested versions of Audio Technica products, commonly the ATH-M50X at NOK 1 100. However, these are headphones for listening, without a microphone for speaking. These could be complemented with an Audio Technica ATR3350iS omnidirectional condenser lavalier microphone, that comes with an adapter, allowing it to be used with handheld devices. These cost almost NOK 550, for a total price of almost NOK 1 650.

This seemed expensive, and I started to investigate office and gaming headsets. Even the Logitech G433 and the Logitech G Pro X seemed too expensive, at NOK 1 250 and NOK 1 350 respectively. I decided that I could stretch myself to buy a Logitech G Pro at NOK 1 000, as a compromise. However, on the day I decided to buy one, the G Pro X came on sale at NOK 900, which was lower than either the G Pro or the G433. It was purchased on 2020-10-07. Then I purchased a second, wireless, version for NOK 1 300 on 2023-12-31. Today’s prices: the G Pro X wireless headset is NOK 1 700, while the wired variant costs NOK 1 100.

Printers

There are two types of printers that are suitable for individuals, families and most small businesses. These are colour laser and ink-jet.

I have never had a personal relationship with Hewlett-Packard (HP), in part because my employers, Møre and Romsdal county and North Trøndelag county, bought and used many of their products, which often had reliability issues. Thus, when it came time for us to buy printers, we chose Canon.

Ink-jet printers are usually inexpensive, because manufacturers make their money from selling ink cartridges. The advantage of a laser printer, in contrast to an ink-jet printer, is that while its toner cartridges are more expensive, they do not dry out.

We have a Canon i-Sensys MF633CDW colour laser printer with a scanner. We print out very few pages a year, yet there is never a problem when we do so. Most of the time, the machine is used as a scanner.

I imagine this will be the last printer we purchase, as long as we can buy cartridges that fit the machine. We buy remanufactured cartridges from Yaha, a company located in Arendal in Agder, in southern Norway. They even offer cartridges that fit ancient printers.

Our relationship with Yaha dates to 2008-02-16, when we bought our first laptop computer, an Acer Travel Mate 5520. It provided a 15.4″ screen, with 2 GB RAM and 160 GB hard disk drive storage. The invoice states that it came with a Windows XP operating system. It cost NOK 6 500. When I look at the details, it appears that this machine was destined for our daughter, Shelagh. A month later, we bought a second machine, with a 17″ screen. It appears to be for our son, Alasdair.

Needs/ Wants

Because I have the opportunity to do so, I prioritize the purchase of computer equipment beyond minimal household needs. While these purchases could be considered (and budgeted) as part of the computing infrastructure, a more honest accounting is to treat them as hobby electronics expenditures.

Soon, I hope to relocate a Behringer MS-1 synthesizer (purchased 2022-03-25) beside my desktop machine, so that they both can take advantage of the same audio equipment, including a Native Instruments Komplete Audio 6 interface (purchased 2020-11-12). If I should want to share sounds locally, there is Red, an Angry Birds speaker. (An audiophile would add that my headset uses a hybrid mesh PRO-G 50 mm driver, with a neodymium magnet, providing a frequency response between 20 Hz and 20 kHz, and an impedance of 35 ohms.)

Personas

Devices have personalities. I refer to them as third-level personas. Persona was first documented as a word in 1905–10. It is derived from the Latin persōna, meaning mask or character. The first two levels are people and pets. When we first started using computers, we named them after Agatha Christie characters, such as Jane (Marple). Once that list had become exhausted, we named them after departed relatives. This can create some challenges. When it came time to name a computer after my maternal grandmother, Jane Andison née Briggs, that name had already been used. Now, I have begun naming devices after qualities. This weblog post is being written on/ with Enigmata, which refers to puzzling or inexplicable occurrences or situations. Linguists will appreciate learning that enigma came into the English language in 1530–40. It is from the Latin aenigma, from the Greek aínigma, equivalent to ainik- (stem of ainíssesthai = to speak in riddles, derivative of aînos = fable) + -ma, a noun suffix of result.

Yes, there are gamers who do have their distinct requirements, usually specified in terms of graphics and latency (time delay), and other explicit requests. There are some/ many Macintosh users, who answer Apple, even before any question is asked. Then there is a category called most people, who simply accept whatever machine and operating system combination some salesperson/ website is promoting that day – Chrome OS on inexpensive, Windows on mid-range and Surface on more exclusive machines. Linux? Well, that takes effort.

In addition to backup files on the Mothership server, copies are stored on assorted external drives. Recently, and for the first time, an external hard disk drive (EHDD) has failed. It was a Seagate Backup Plus 4 TB unit, purchased 2016-11-22. It had been used minimally. Since then, no other EHDD has failed.

Part of the challenge of thriving with a computer is a function of one’s age. The first time I heard that voice control would replace keyboards was probably around 1978. It would take five years, an expert had suggested: 1983. Since then, more than forty years have passed, and I still buy keyboards regularly. I now expect to use a keyboard as my primary input device for the duration of my life.