Clouds & Puddles

Clouds are not always cute. Sometimes they are threatening, with the potential to damage, injure and even kill. Inderøy, Norway 2015-08-11. Photo: Patricia McLellan.

This weblog post is written to help people gain a better understanding of a house/ home/ residential computer network.

The Xample family, with mother Ada, father Bob, cat Cat and daughter Deb, are going to serve as an example. Currently, the family has the following equipment. Both Ada and Deb have iPhones, but Bob has a Samsung phone. In addition, Ada uses a Mac laptop, Bob has an old Acer desktop machine, while Deb uses a Chromebook laptop that belongs to her school. The family is connected to the internet with a router, and they have an old Canon printer.

Some basic vocabulary

Sometimes, language can be confusing. To communicate about computers, people have to use words the same way, or misunderstandings will occur.

Users are people. Admittedly, there could be situations where something other than a human is a user. For example, at some point in the future Cat might be able to activate a feeding station. That would make Cat a user.

A computer, frequently known as a device, consists of hardware and software. Hardware is made up of physical/ mechanical components. Yet, these components will not work without software: a collection of instructions and associated data that provides services.

Software capable of doing something is called a program. A computer program that is running/ working is usually referred to as a process. Software is written in a computing language that humans can read and programmers can understand. This source code is then translated into machine code that the specific hardware on a machine can understand.

Operating systems are an important category of software. An operating system is the most fundamental program. It controls the functioning of the hardware and directs its operations. It manages the hardware, including component parts, but hides the details from users. The Xample family, like most families, uses several different operating systems. The iPhones use iOS, the Samsung phone uses Android, Ada’s Mac laptop uses macOS, Bob’s desktop uses Windows 7, and Deb’s Chromebook uses ChromeOS. Both the router and the printer also have their own special operating systems.

Networks

The term network can be used as either a noun or a verb. Begin with the verb, to network: a situation where two or more devices are able to exchange information with each other. The noun network refers to the sum of devices that are connected together. When networking, each connected device is referred to as a node. When Ada’s iPhone connects to the family router, for example, the iPhone is an end node and the router is a network node.

Nodes are connected to each other using data links. These links may be wired or wireless. The most common type of wired link is the Ethernet cable. The most common type of wireless link is WiFi. While some houses have only WiFi links, there can be good reasons for using cabled links if there is an opportunity for it.

Nodes don’t need to have a direct connection to each other to be part of a network. Thus, one node could be located in Bergen, another in Detroit, and a third in New Westminster. This is an example of a wide area network (WAN). If the nodes are all located in the same building or cluster of buildings, it is a local area network (LAN). For example, there could be several nodes inside a house, with one in a garage and another in a shed.

Computer professionals often use the terms client and server to describe a computing model. These are confusing terms, and fortunately most people do not need to use them. The important thing to know is that a client either does local work or requests an external service. A server either provides that service directly to a client, or supervises an external supplier working on behalf of a client. Both terms can refer to either hardware or software. Focus on the software to understand what is happening.
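To make the client/server split concrete, here is a minimal sketch in Python, using only its standard library. The “service” (upper-casing a line of text), the loopback address and the single-request server are all simplifications chosen for illustration; this is not how a real mail or file server is written.

```python
import socket
import threading

def serve_once(server_sock):
    """The server side: wait for one client, provide one service."""
    conn, _ = server_sock.accept()       # wait for a client to connect
    with conn:
        request = conn.recv(1024)        # receive the client's request
        conn.sendall(request.upper())    # the "service": upper-case it

# Set up the server on the loopback interface; port 0 lets the OS pick
# any free port, so the sketch runs anywhere.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# The client side: connect, request the service, receive the result.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello server")
reply = client.recv(1024)
client.close()
print(reply.decode())  # prints "HELLO SERVER"
```

The point of the sketch is the division of labour: the client asks, the server answers, and both roles are software, whatever hardware they happen to run on.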

To help the Xample family transition to a network suitable for them, we are going to look at four challenges facing the family.

Challenge #1 Sending an email with an attachment

Ada has struggled for the past year to knit herself a sweater. It is finally finished, and she has taken a photo of it with the camera on her iPhone. She wants to send a copy of the photo to her brother Ely.

When Ada uses her phone to write an email, she is using an email client that is built into the iPhone. This client also allows her to attach the photo. When it comes time to send the email and its attachment, the client hands the message to her email provider’s mail server, reaching it across the internet via the family router.

In this example, there are two communicating devices, Ada’s iPhone and the family router. There is a network, even if it is small, simple and temporary. The two devices are connected using WiFi.

Ada’s iPhone breaks both the email message and the photo into data packets. Each packet carries the network address of its destination, the mail server, rather than Ely’s email address itself. To decide where to forward each packet, the router looks up that address in a routing table. When Ada receives an email, her device reassembles the incoming data packets so that the message can be understood. At home, most people use a digital subscriber line (DSL) or cable router connected to the Internet through an Internet service provider (ISP).
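A routing table can be pictured as a list of destination networks, each paired with a next hop. The sketch below shows the longest-prefix-match idea in Python; the network addresses and hop names are made up for illustration, and a real router does this lookup in specialized hardware.

```python
import ipaddress

# A toy routing table: destination network -> next hop.
# 0.0.0.0/0 is the "default route": anything not matched elsewhere.
routing_table = {
    ipaddress.ip_network("192.168.1.0/24"): "local LAN",
    ipaddress.ip_network("0.0.0.0/0"): "ISP gateway",
}

def route(destination: str) -> str:
    """Return the next hop for a destination address."""
    addr = ipaddress.ip_address(destination)
    # Longest-prefix match: the most specific network containing
    # the address wins.
    matches = [net for net in routing_table if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return routing_table[best]

print(route("192.168.1.42"))  # a device at home -> "local LAN"
print(route("8.8.8.8"))       # anywhere else    -> "ISP gateway"
```

A home router’s table really is about this small: traffic is either for the local network or for the ISP.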

A DSL router typically integrates a modem. A modem sends packets of data across telephone lines, TV cables or optical fibers. A modem, by itself, does not provide the functions of a router.

Challenge #2 Printing

Ada wants to take a copy of the photo on her next visit to her grandmother Fay (1919 – ). Fay is not computer literate, but likes to decorate the walls of her room with photos. Before looking at what is happening in detail, here is one more term: when a document is printed on paper, a print server coordinates the printing process.

The main challenge with the family Canon printer is that it is so old that it doesn’t have any WiFi connection and won’t connect directly to the iPhone.

Ada connects her iPhone to her MacBook using the iPhone’s charger cable. She plugs the charging end into the iPhone, and the USB end into the MacBook. She then opens her Mac Photos app on her Mac laptop, clicks on Import, selects the photo she wants to transfer, clicks on Import (#) Selected, then finally clicks on Albums. Now the photo is on her laptop, and Ada can disconnect the charger cable.

To print the photo, Ada takes a USB cable, permanently attached to the printer, and plugs the other end into a USB port on her computer. Using a printer program and drivers, previously installed on her laptop, she can now print the photo. By plugging in the cable, Ada has once again set up a small, simple and temporary computer network. This time, it consists of the MacBook laptop and the Canon printer.

Challenge #3 A Permanent Network

Deb has her bedroom upstairs. If she wants to use the printer, she has to take her Chromebook downstairs and connect it to the printer using a USB cable. This is inconvenient. Most individuals/ couples/ families need system resources that can be shared by all/ many/ some users effortlessly. The basis for a permanent network may already be in place with the WiFi capabilities that are built into most domestic routers.

WiFi is a set of standards that allow devices to communicate with each other wirelessly. Devices that can use WiFi include desktops, laptops, smartphones, tablets, smart TVs, printers, digital audio players, digital cameras, cars and even drones.

Some families may consider replacing their printer with one that has WiFi capabilities. An alternative approach is to keep the printer, but to invest in a Network Attached Storage (NAS) server. A NAS can act as a print server, letting approved users print on a common printer.

Equally important, it can also act as a file server, so that common files can be stored in a central place and used by everyone. Such files include media files (video files, audio files, e-books), family photographs, and documents used by the entire family.

Everyone in the family will have to become a user of the NAS, with their own login. Be sure to add an additional user, called Guest. Not all users are treated equally. Users have to log their client devices onto the system with a user name and a password, or some other approved form of identification. In this way, random visitors are prevented from accessing the server and its resources without permission. Guest will typically be able to access the internet, but not to access files on the NAS or use the printer.
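Conceptually, this setup amounts to a table mapping each account to the services it may use. The sketch below is purely illustrative; the account names follow the Xample family, and a real NAS stores accounts and permissions far more elaborately.

```python
# Hypothetical permission table for a home NAS: each account maps to
# the set of services it may use. Guest gets internet access only.
permissions = {
    "Ada":   {"files", "printer", "internet"},
    "Bob":   {"files", "printer", "internet"},
    "Deb":   {"files", "printer", "internet"},
    "Guest": {"internet"},
}

def may_use(user: str, service: str) -> bool:
    """Check whether a logged-in user may use a given service.
    Unknown users get nothing at all."""
    return service in permissions.get(user, set())

print(may_use("Deb", "printer"))   # True: Deb can print from upstairs
print(may_use("Guest", "files"))   # False: visitors cannot read files
```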

A NAS can also back up personal files. These backups can be encrypted and password protected, so that they are inaccessible to others.

The Xample family decide that a QNAP TS-251A best suits their needs. They equip it with 4 TB of storage, which they regard as adequate. The printer can now be permanently connected to the NAS through a USB port, and shared with approved users. Many printer drivers are immediately available, although older printers may require some effort to download appropriate drivers. If the printer is compatible with the NAS, the NAS will display a message to confirm that the printer is connected.

Challenge #4 Clouds & Puddles

The Xample family now have 3 smartphones, 2 laptops, 1 old desktop, 1 printer, 1 router and 1 NAS. The NAS can function as a print server and a file server. It is also a media centre, serving videos and audio.

Computer hardware manufacturing companies are always keen to describe old products with new names. They are always looking at ways to make their rather dull equipment seem more important than it actually is. Edge and cloud computing are two such names.

An edge computer, in dataspeak, is a local server device at the edge of the internet, hence the term. Many will find it difficult to distinguish an edge computer from any other server, because almost everything today is connected to the Internet. However, in years past, many local servers were only connected to local devices using a local area network (LAN) typically wired with Ethernet cable. There was no connection to the outside world.

The cloud, in dataspeak, refers to someone else’s server. Companies that offer cloud services to the public often claim that they do the heavy lifting, storing and safeguarding data. This is not always the case. Sometimes they lose data. Sometimes they lend data to others. They might even keep a copy of it when you ask to have it back. The misuse of data held in trust may have economic as well as other consequences. Adobe, Amazon and many other companies are very keen for consumers to visit a nearby cloud and use software as a service. This is the most profitable for them. Using a cloud can be expensive.

For a short period, just after cloud computing came into vogue, it became fashionable to name non-cloud servers after bodies of water. Large businesses might refer to their in-house servers as lakes. Referring to a server at home as a lake verges on the pretentious. Modesty dictates referring to smaller bodies of water: a personal (puddle) server, a nuclear family (pool) server or an extended family (pond) server.

Fun Assignment: The reader is asked to distinguish a carbon based error from a silicon based error. Assistance with this problem can be found here.

Devices Future

Volkswagen and D-Wave Systems have used quantum computing to find optimal routes, as illustrated here in Lisbon, Portugal, and available as an app near you. (Photo: Volkswagen)

… and the answer is, everywhere.

Now for the question, where do people want to use computing devices?

Guesstimates

After trying to collect and interpret validated statistics, I have given up and present some numbers that might approach something meaningful and coherent. Some are based on information collected by Simon Kemp, dated 2019-01-31. Other bits come from Wikipedia, such as this article, along with a variety of other places with assorted dates.

With a world population of 7.7 billion people, there are over 5 billion handheld devices. The vast majority are also referred to as mobile phones, increasingly smartphones, although they do much more than connect people using voice communication. It would be much more honest to eliminate any reference to phone in the description. The German Handy or the French Portable are both better. Other devices in this category include tablets, and similar devices lacking keyboards. Regardless, Android operating system variants clearly and increasingly dominate, with at least 75% market share; Apple’s iOS, with a declining share, takes most of the remainder. It remains to be seen if Huawei will be able to introduce a viable alternative to Android.

Two important characteristics distinguish larger personal computers from handheld devices: a large screen and a keyboard. Minor differences include the use of a mouse or some other pointing device. These larger machines are often referred to as laptops and desktops. Worldwide, this is a small segment compared to mobile devices, and its importance is decreasing. Part of the reason for this decline is their inability to be used everywhere.

There is general agreement that the billionth personal computer shipped in 2002, and that there were one billion such computers in operation in 2008. The dispute is how many are in use now. Some are looking for a magic number of 2 billion, but 1.5 billion units is far more likely. Windows will be installed on at least 75% of machines, macOS on, say, 13% (which to me seems high), ChromeOS on 6% (at least in the US, and higher than I experience in Norway) and Linux on 2%. The 2019 Stack Overflow developer survey gives very different figures for machines used by computing professionals. In round numbers: Windows on 45%, macOS on 30%, and Linux on 25%.

Another category of computer is the embedded device. One essential aspect of these is the electronic control unit (ECU). Domotics refers to home automation. It includes all aspects of smart home technology, including sensors that monitor the environment and actuators that activate controls: temperature, lighting and security. However, embedded computing is pervasive, found everywhere from electric toothbrushes to toasters and every other form of kitchen machine. Today, even a lightbulb can contain an ECU. A typical smart house may contain hundreds of these devices.

The vast number of ECUs expected, plus their vulnerability in terms of security, means that WiFi can only be a temporary solution. While communication can be built on top of 120/240 V AC circuits, most devices, including LED lights, actually use low voltage DC power. Anyone building something new should be installing Cat 6A Ethernet cable at a minimum, with connections to every room. Power over Ethernet (PoE) can then provide DC power to almost everything needed.

I expect clothing will soon include embedded devices, so that personal data can be continuously collected and monitored. In Sweden, I note that several individuals have voluntarily inserted RFID devices into their bodies, so that they can use these to identify themselves, rather than relying on PIN codes. Unfortunately, it is probably only a matter of time before these devices become mandatory.

Embedded devices are also found in cars where even the most primitive contain 25 – 35 ECUs. More luxurious models may have 70 or more ECUs. Hopefully, autonomous vehicles will soon be on streets near you. The last thing this world needs is a nut behind the wheel, especially one that feels provoked into road rage at the slightest offence. Electric vehicles are already here, with Tesla’s innovations leading the way. In Norway, there will be no opportunity for people to buy fossil fueled vehicles (including hybrids) after 2024. Everything will probably be battery electric, as an explosion at a hydrogen fueling station has dimmed everyone’s interest.

Command and control (C2) is defined by Marius Vassiliou, David S. Alberts and Jonathan R. Agre in C2 Re-Envisioned: the Future of the Enterprise (2015) as a “set of organizational and technical attributes and processes … [that] employs human, physical, and information resources to solve problems and accomplish missions.” (p. 1) This definition can apply to individuals, households, organizations, small businesses, large enterprises or even the military. One major challenge has been the tendency of large manufacturers of ECUs to consider just their own product range, and to make controllers for these and only these. This is not a viable solution. Our household has opted for the most inclusive solution, found in Home Assistant.

Miniaturization will continue into the future. I am uncertain about the future form factor of personal devices/ phones. Will they shrink to wristwatch size or remain about the size they are today? My guess is that today’s form factor wins. Yes, one can imagine screen technology being built into glasses or wrist watches, but will it happen? It will be interesting to see what has happened by 2040 and beyond.

In terms of PCs, they could be doomed to extinction. Physically smaller personal devices will be capable of doing everything PCs do. However, there may be situations where a person wants a larger screen, a keyboard and a pointing device, so the personal device will have to interact with these. I am not certain when voice control will replace the keyboard. When I first studied computing, in the mid-1970s, 1980 was considered a target date for its replacement. However, that was based on people moving from card punches to something else.

In terms of servers, one can also envisage a household having something the size of a small media centre, perhaps 100 x 100 x 50 mm (4″ x 4″ x 2″) which is about the size of our Asus PN 40 media player. At the current rate of miniaturization, it should be able to hold at least 100 TB by 2040. One could ask why anyone would need so much storage capacity, but today everyone seems capable of using every last byte of storage they have, and I see no reason for attitudes to change. Computers will be used in new areas because people have the processing power and data storage capacity to do it.

Perhaps the greatest change will come as quantum computing matures. Quantum computing is real. It allows computations to be made in seconds that would take a conventional supercomputer considerably longer. Google claims that its Sycamore processor, with 54 qubits, has achieved quantum supremacy and is the most advanced quantum computing processor in the world, capable of processing in 200 s what a Summit supercomputer would need 10 000 years to accomplish, making quantum computing 1 577 880 000 times faster. IBM has countered this, stating that it would only take 2.5 days, making quantum computing about 1 000 times faster. Regardless, quantum computing will provide faster calculations.
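The speed-up figures quoted above can be checked with a little arithmetic:

```python
# Checking the claimed quantum speed-up figures.
seconds_per_year = 365.25 * 24 * 3600        # ≈ 31.6 million seconds
sycamore = 200                               # seconds, Google's claim
summit_google = 10_000 * seconds_per_year    # Google: 10 000 years
summit_ibm = 2.5 * 24 * 3600                 # IBM: 2.5 days

print(round(summit_google / sycamore))  # 1 577 880 000, as stated
print(round(summit_ibm / sycamore))     # 1080, i.e. about 1 000 times
```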

With my origins in Vancouver/ New Westminster, and with some of my most positive learning experiences at the British Columbia Institute of Technology, I will end this post by mentioning its Burnaby neighbour, D-Wave Systems. In 2019 they announced their next-generation Pegasus quantum processor chip, the world’s most connected commercial quantum system, with 15 connections per qubit and more than 5 000 qubits, to be available in mid-2020.

Devices Past

3D Rendering of computer center with IBM System/370-145 and IBM 2401 tape drives (Illustration: Oliver Obi)

In ancient times, computing meant batch systems that required users to drive across town to a computing centre, to punch their programs onto cards, then to submit those cards so they could be read by a card reader. An IBM 3505 Model B1 card reader from 1971 could read 80 column cards at the rate of 1200 CPM (cards per minute). It was based on the Hollerith Keyboard punch, from 1890. The programs were then run on a mainframe computer, such as an IBM System /370 dating from 1970. A machine consisted of several units housed in a large air-conditioned machine room with a raised floor, to improve cooling and conceal wiring. Processing took time, and results were provided an hour or two later from high-speed printers, such as an IBM 3211, printing at about 2 000 lines per minute, more than enough to keep up with the punched card input. This was the basic situation from the mid-1950s until at least the mid-1970s, with variations.

The IBM System /370 Model 145 had 500 kB of RAM, 233 MB of hard disk space, and ran at 2.5 MHz. It cost from US$ 705 775 to US$ 1 783 000. The US Bureau of Labor Statistics consumer price index states that US$ 1 in 1970 is worth US$ 6.63 in 2020. So the IBM System /370 Model 145 would cost from about US$ 4.7 million to almost US$ 12 million in 2020 dollars.
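The conversion can be verified with a couple of lines of arithmetic:

```python
# Converting the 1970 price range to 2020 dollars, using the stated
# consumer price index factor.
cpi_factor = 6.63                   # US$ 1 in 1970 ≈ US$ 6.63 in 2020
low, high = 705_775, 1_783_000      # 1970 price range in US$

print(round(low * cpi_factor / 1e6, 1))   # 4.7  (million 2020 US$)
print(round(high * cpi_factor / 1e6, 1))  # 11.8 (million 2020 US$)
```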

Computers are a mix of hardware and software. Writing system software was a craft where a select few excelled. They wrote elegant but lean code that executed fast. In ancient times, when the hardware was primitive, craftsmanship mattered. Compilers and operating systems had to be written in assembly/ assembler language for increased efficiency and space savings. A programmer had to think like a processor, moving code into and out of registers. As computer hardware improved, the need to write parsimonious code gradually disappeared. Programmers started becoming verbose. Programming as a profession expanded far beyond the few.

To gain an understanding of the situation facing professional programmers, at this time, one of the best books to read is The Mythical Man-Month (1975) by Frederick Brooks (1931 – ). During Brooks’ exit interview with IBM’s legendary CEO Thomas Watson Jr. (1914 – 1993), a seed for the book was planted. Watson asked why it was harder to manage software projects than hardware projects. In this book the answer is stated, now known as Brooks’ law: “Adding manpower to a late software project makes it later.”

A 2020 Raspberry Pi 4 Model B is available with 1, 2 or 4 GB of RAM. That is anywhere from 2 000 to 8 000 times more than that found on the IBM machine in the previous paragraph. A 16 GB (or larger) SD card contrasts with 233 MB of hard disk space: about 68 times more. The speed of 1.5 GHz with 4 cores competes with 2.5 MHz with a single core: potentially a 2 400 times speed increase. More than anything else, with an RPi costing between US$ 35 and US$ 55, the IBM machine cost roughly 100 000 times more in 2020 dollars.
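These ratios are easy to re-derive:

```python
# Reproducing the Raspberry Pi 4 vs IBM System /370 Model 145 ratios.
ibm_ram = 500e3      # 500 kB of RAM
ibm_disk = 233e6     # 233 MB of hard disk
ibm_clock = 2.5e6    # 2.5 MHz, single core

print(round(1e9 / ibm_ram))          # 1 GB model: 2 000 times the RAM
print(round(4e9 / ibm_ram))          # 4 GB model: 8 000 times the RAM
print(round(16e9 / ibm_disk, 1))     # 16 GB SD card: ≈ 68.7 times the disk
print(round(1.5e9 * 4 / ibm_clock))  # 4 cores at 1.5 GHz: 2 400 times
```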

By the 1980s, card punches had given way to terminals, consisting of a screen (frequently offering green text on a black background) and a keyboard. These were connected indirectly to a mini-computer that replaced the mainframe. Digital Equipment Corporation was especially fond of using Ethernet cable to connect terminals to its VAX mini-computers. Offices were starting to be interconnected. These machines still required their own machine rooms with adequate cooling, as well as a drive to the office.

To understand this new mini-machine period of computing, there is yet another book to read, The Soul of a New Machine (1981) by Tracy Kidder (1945 – ). Data General needs a machine to compete with Digital Equipment’s 32-bit VAX. In North Carolina, they start project “Fountainhead”, diverting almost all of their senior design personnel to it. The few remaining senior designers in Massachusetts are allegedly engaged in improving Data General’s existing products. However, Tom West (1939 – 2011) starts a skunkworks project, “Eagle”, that becomes a backup in case Fountainhead fails (which it does). It is a high risk project, using new technology and arguably misusing newly graduated engineers.

There are lots of candidates for declaring the first PC, as in personal computer. Personally, I opt for the 1973 Xerox Alto, since it offered both hardware and software that worked. Others may refer to the 1977 Apple II, 1977 Commodore PET 2001, 1977 Radio Shack TRS-80 or even the 1981 IBM PC.

Most people were still using a terminal, rather than a PC, until about 1990. Terminals didn’t die when PCs arrived, because there was initially no easy way to connect a PC to the mini-computer. The two machine types had incompatible operating systems: MS-DOS on PCs, and a host of proprietary operating systems on the assorted mini-machines. Novell NetWare and Banyan Vines offered solutions, but these were weak and difficult to implement. Important data was stored and backed up on tapes that required special readers located in a machine room. When PCs did finally connect to larger computers, the PC usually required an Ethernet card, the entire building had to be wired with Ethernet cables, and the mini-computer was renamed the server, living inside 19-inch racks with 1.75 inch rack-units, a system standardized by AT&T around 1922.

The other first PC, as in portable computer, today better known as a laptop, is a matter of debate. The Xerox Dynabook from 1972 was a fantastic machine, except for one fatal flaw – it was never actually built in hardware, only as a conceptual model. Most other early machines were either too heavy or were equipped with screens that were too small. This situation continued until 1985, when Toshiba finally produced the T1100, fairly accurately described as “the world’s first mass-market laptop computer”.

Both LANs (Local Area Networks) and WANs (Wide Area Networks) started interconnecting users in the early 1990s. The need for servers brought about a need for a standardized operating system. The first steps involved the use of different flavours of Unix, first developed in the 1970s at Bell Labs, along with the C programming language. The Unix modular design provides a set of simple tools that each performs a limited, well-defined task. It uses its unified filesystem as the primary means of communication, along with shell scripting.

A number of unfortunate issues related to the proprietary origins of Unix, led many to seek an open-source solution. It was found in the use of BSD (Berkeley Software Distribution) and the Linux kernel based operating system distributions, along with other related products that could be used freely. Linux was able to address a variety of different segments, servers, consumer desktops, laptops, tablets, phones and embedded devices. This is assisted by the modular design of the Unix model, which allowed the sharing of components.

Initially, home users had the choice of Windows or Apple operating systems. In the mid- to late 1990s, low-speed, dial-up modems allowed Internet access. People even started wiring their houses for the Internet, with Ethernet cables. However, most office and home computers were still beige boxes.

Fictional tablets first appeared in Stanley Kubrick’s (1928 – 1999) 2001: A Space Odyssey (1968). Real tablets first appeared in the last two decades of the 20th century. However, it wasn’t until 2010, when Apple released the iPad, that the tablet achieved widespread popularity.

Cell phones are often incorrectly referred to as mobile devices. They are more correctly handheld devices, even if they spend most of their time in assorted pockets and bags. It is the human with the bag or pocket that is mobile. On 1966-12-01, the first commercial cellular network (OLT) was launched in Norway. This was subsequently replaced, in 1981, with the Nordic Mobile Telephone (NMT) system, in operation in Denmark, Finland, Norway and Sweden. These used what would be regarded today as massive phones. The first personal digital assistant (PDA) that could be accepted today as a handheld device was the 1984 Psion Organizer, although PDA was not used as a term until 1992.

The 1996 Nokia 9000 Communicator, can be regarded as the first primitive smartphone. It was actually a hybrid combining PDA and conventional cell phone features. Canadians, especially, will want to date the smartphone to Research in Motion’s 2002 BlackBerry. The company founder, Mihal “Mike” Lazaridis (1961 – ) is in some ways the Canadian equivalent of Steve Jobs (1955 – 2011).

Corrections: Alasdair McLellan has corrected two errors. I had exaggerated the size difference in RAM between the IBM System /370 and a Raspberry Pi by a factor of 1 000. It is not 2 – 8 million times larger, but only 2 – 8 thousand times larger. The first commercial cellular network was not Japanese, but Norwegian. Documentation for it can be found in this Norwegian language source.

The Norwegian Wikipedia also writes about it, stating: Offentlig Landmobil Telefoni (OLT) var det første mobiltelefonnettet i Norge. Det ble opprettet 1. desember 1966 og ble nedlagt i 1990 (stopp for nye konsesjoner var 1. november 1989). Ved introduksjonen av NMT i 1981 var det ca. 22 000 abonnenter på OLT. OLT var manuelt, og ikke et automatisk mobilsystem, og regnes derfor som før “1G”.

Translation into English: Public Land Mobile Telephone (OLT) was the first mobile telephone network in Norway. It was established on December 1, 1966 and closed in 1990 (stopping for new licenses was November 1, 1989). At the introduction of NMT in 1981, there were approx. 22,000 OLT subscribers. The OLT was manual, and not an automatic mobile system, and is therefore considered as before “1G”.

Computing: The Series

Red lighted Keyboard (Photo: Taskin Ashiq, 2017)

In 2020, a series of weblog posts about computing devices will be written and published. The first in this series, about the end of support for Windows 7, was already published one week ago, on 2020-01-07.

Many people do not know what types of devices will be advantageous for them to acquire. Even when they know the type of device, they do not understand how to evaluate that category in order to make appropriate purchases. Brand names and price become proxies for device quality. Unfortunately, this can result in inappropriate devices being selected. Not all devices need to be purchased new. Many older, even discarded, devices are often suitable for continued use, but may require the installation of different, more appropriate software.

This series consists of:

  1. Windows 7 (2020-01-07)
  2. Computing: The Series (2020-01-14)
  3. Devices Past (2020-01-21)
  4. Devices Future (2020-01-28)
  5. Clouds & Puddles (2020-02-04)
  6. Universal Serial Bus (2020-02-11)
  7. Video connectors (2020-02-18)
  8. Power supply/ charging (2020-02-25)
  9. Input & Output Peripherals (2020-03-03)
  10. Computer Access & Assistance (2020-03-10)
  11. External Drives (2020-03-17)
  12. Printers (2020-03-24)

Starting 2020-04-01, the focus at Cliff Cottage, will be on outdoor building construction. There will be limited time for blogging, with the exception of a single monthly update. Blogging will resume again 2020-10-06. There are several different categories of computing devices that most people may use/ acquire for work and leisure:

  1. Handheld devices (2020-10-06)
  2. Laptop & desktop devices (2020-10-13)
  3. Media players (2020-10-20)
  4. A Practical Server (2020-10-27)
  5. Vehicle devices (2020-11-03)
  6. Smart home devices (2020-11-10)
  7. Other embedded systems (2020-11-17)
  8. Sensory impairment (2020-11-24)
  9. Dexterity/ Mobility impairment (2020-12-01)
  10. Telemedicine (2020-12-08)
  11. Nightscout (2020-12-15)
  12. Computing: A Summary (2020-12-22)

Many of these will focus on the needs and limitations of older users, and how to mitigate the impact of various impairments.

Each topic, including publication dates, is subject to revision. People who want other topics covered can contact me via email: brock@mclellan.no

Update: On 2020-02-04 at 13:40, two of the topics in this post were changed. 09. Input Devices (2020-03-03) and 10. Output Devices (2020-03-10) were merged into 09. Input & Output Devices (2020-03-03), and a new topic, 10. Computer Access & Assistance (2020-03-10), was created. On 2020-02-16 at 07:00, Input & Output Devices was changed to Input & Output Peripherals.

Update: On 2020-11-14 at 14:30: Originally, the schedule of weblog posts in this series was: 8. Visual impairment (2020-11-24); 9. Hearing impairment (2020-12-01); 10. Dexterity impairment (2020-12-08); 11. Mobility impairment (2020-12-15); and 12. Computing: A Summary (2020-12-22). This has been changed to the following: 8. Sensory impairment (2020-11-24), which combines hearing and visual impairment; 9. Dexterity/ Mobility impairment (2020-12-01); 10. Telemedicine (2020-12-08), which looks at telemedicine generally; 11. Tidepool (2020-12-15), which examines automated insulin dosing. 12. Computing: A Summary (2020-12-22) remains the same.

Update: On 2020-12-13 at 12:30: The name of topic 11 has been changed from Tidepool to Nightscout. The reason for this is to focus more on the work of a social network than on a corporation.

Windows 7

Windows 7 (2009-07-22 – 2020-01-14) R.I.P.

This weblog post was written to discuss the situation facing people currently using Windows 7, and who will find themselves without support after 2020-01-14.

Summary: If you currently run Windows 7, your machine will soon be insecure. If you intend to keep older hardware, you will probably have to transition to a Linux distro, because that hardware won’t be able to handle Windows 10. If you intend to upgrade the hardware, you may still prefer to transition to something like Linux Mint, because the learning curve will be gentler. Windows 7 is closer to Linux Mint than it is to Windows 10.

If you are still running Windows 7, you are not alone. Many are in the same end-of-life situation, which requires them to rethink their operating system strategy. Should they keep on using Windows 7, upgrade to Windows 10 (which may require a hardware investment), or keep the same hardware and move over to something like Linux Mint? Chris Barnatt has three videos where he discusses the challenges facing Windows 7 users. The first, from 2018-08-26, deals with a transition to Linux Mint for Windows users. The second, from 2019-11-17, is about hardware that allows for a quick change of drives between the two operating systems. The third, from 2019-12-22, shows what Chris is planning to do. Yes, it includes keeping Windows 7, but disconnecting it from the internet. Don at Novaspirit Tech also has some insights. Heal My Tech provides an alternative view.

There are also a large number of other Linux distros that could be selected instead of Linux Mint, but there is only a short window of opportunity left to experiment and make a selection. Distros can be tested by running them as a virtual machine, or running them from a memory stick.

VirtualBox is open-source and can run on a Windows 7 host machine. Multiple guest operating systems can be loaded, started, paused and stopped independently, each within its own virtual machine.

I used Windows regularly because my employer insisted on it. This situation ended upon my retirement at the end of 2016. Looking back, Windows XP was my favourite Windows version, even though it only reaches warm on a scale ranging from cold to hot. My opinion of Windows 7 is that it ranks luke-warm, but definitely much warmer than the cold assigned to Windows 8 and 8.1 (the last version I was compelled to use).

Here is part of the Windows timeline. Certain versions are deliberately missing: Windows XP, codenamed Whistler – after the British Columbia mountain and resort, was released in 2001, supported until 2009 with extended support lasting until 2014. Windows 7 (Blackcomb – another BC mountain) was released in 2009, supported until 2015, with extended support ending this month, 2020-01-14. Windows 8 (no codename) was released in 2012, and version 8.1 (Blue) in 2013. Regular support ended in 2018, and extended support will end in 2023. Windows 10 (Redstone) was released in 2015, and is currently supported.

These days, irregular use of Windows 10 is required because another resident has a computer with Windows 10 installed, an Asus ZenBook laptop, and I seem to have some maintenance responsibilities. I have tried to find Linux equivalents for all of the software she uses regularly, but have not been able to port her library management system (BookCAT), and catalogue of almost 4 000 paper books to anything open-source. I find Windows 10 not just complex, but also confusing, and grade it frosty.

I have also used macOS: in the 1990s, when we owned several different Apple Macintoshes; at school in the early 2000s, when I was teaching Media and Communication; and most recently in 2015, when I inherited a MacBook Pro from my kind daughter, along with an iPhone 5S. I liked the operating system, and ranked it balmy, slightly above Windows XP. Yet, vendor lock-in prevents me from ranking it any hotter, and from going out and buying anything with an Apple label. Walt Mossberg provides an overview of developments at Apple over the past forty years.

Regretfully, I am an experienced Acer Chromebook 11 user. This system receives the grade of frozen, if only because it refused to play sound with Firefox. It also had several other faults, so one year and one day after purchasing it, I gave it away. ChromeOS essentially functions as a host for Google’s other products, where all files and folders are stored in Google’s cloud. It exists solely so that Google can offer software-as-a-service (SaaS). Its vendor lock-in is far worse than Apple’s. This insight partially rehabilitated Apple in my eyes.

Microsoft, with Windows 10, is in this respect also imitating ChromeOS. Microsoft Office 365 provides text processing/ spreadsheet/ presentation and other programs with files stored in Microsoft’s cloud. This is one reason I won’t allow it on my computers. I don’t want my personal work spread indiscriminately throughout the world; I would have no idea what Microsoft is doing with it, and no guarantee that it is behaving properly.

SSD vs HDD

A 10 MB HDD for USD 3 500 in 1980 was not an excessive price. The 1 MB of RAM on the Digital Equipment Corporation (DEC) VAX-11/750 minicomputers I used cost over NOK 1 000 000 each in 1980. That is about USD 200 000 in 1980, or about USD 620 000 today (2019). The HDD pictured would cost over USD 10 000 today (2019), taking the value of money into account, which would make the cost of 1TB of storage equal to USD 1 000 000 000 today (2019). Yup, that’s one billion dollars!

SSD = Solid State Drive; HDD = Hard Disk Drive.

The Summary:

For daily operations on a desktop or laptop computer, SSDs are better (read: faster, quieter, more energy efficient, potentially more reliable) than HDDs. However, HDDs cost considerably (6.5 times) less than SSDs. Thus, HDDs are still viable for backup storage, and should be able to last at least five years. At the end of that time, it may be appropriate to go over to SSDs, if prices continue to fall.

The Details:

This weblog post is being written as I contemplate buying two more external hard disk drives (HDDs), one white and one blue. These will be yet more supplementary backup disks to duplicate storage on our Network Attached Storage (NAS) server, Mothership, which features 4 × 10TB Toshiba N300 internal 3.5″ hard drives rotating at 7 200 RPM. These were purchased 2018-12-27. While the NAS has its own redundancy, allowing up to two HDDs to fail simultaneously, a fire or other catastrophe would void this protection. Thus, external HDDs are used to store data at a secret, yet secure, location away from our residence.

The last time external hard disks were purchased was 2018-09-04. These were Western Digital (WD) My Passport 4TB units, 2.5″ form factor, rotating at 5 400 RPM, with a USB 3.0 connector. One was red (costing NOK 1 228) and the other was yellow (NOK 1 205). However, we have nine other 2 – 4TB units, some dating from 2012-11-15. Before these, we had at least 4 units with storage of 230 GB – 1 TB, dating to 2007-09-01. (We are missing emails from before 2006, so this is uncertain territory, although if this information were required, we have paper copies of receipts dating back to 1980.)

The price of new WD My Passport HDD 4TB units has fallen to NOK 1 143. New WD My Passport Solid State Drive (SSD) units cost NOK 2 152 for 1TB, or NOK 3 711 for 2TB. That is a TB price of about NOK 1 855, in contrast to about NOK 286 for an HDD. This makes SSDs about 6.5 times more expensive than HDDs.
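
These per-terabyte figures can be checked with a few lines of arithmetic. The sketch below simply recomputes the NOK prices quoted above; the function name is invented for illustration.

```python
# Per-terabyte price comparison between external HDDs and SSDs,
# using the NOK prices quoted above (early 2020).

def price_per_tb(price_nok: float, capacity_tb: float) -> float:
    """Return the price in NOK per terabyte of storage."""
    return price_nok / capacity_tb

hdd_per_tb = price_per_tb(1143, 4)   # WD My Passport 4TB HDD
ssd_per_tb = price_per_tb(3711, 2)   # WD My Passport 2TB SSD

ratio = ssd_per_tb / hdd_per_tb      # about 6.5
print(f"HDD: {hdd_per_tb:.0f} NOK/TB, SSD: {ssd_per_tb:.0f} NOK/TB, "
      f"ratio {ratio:.1f}x")
```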

I am expecting to replace the disks in the NAS, as well as on the external drives, about once every five years. Depending on how fast the price of SSDs sink in relation to HDDs, these proposed external HDDs could be the last ones purchased.

As the price differential narrows, other disk characteristics become more important. Read/write speed is especially important for operational (as distinct from backup) drives. Typically, a 7 200 RPM HDD delivers an effective read/write speed of 80 – 160 MB/s, while an SSD will deliver from 200 MB/s to 550 MB/s. Here the SSD is the clear winner, by a factor of about three.
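
To see what that factor of three means in practice, consider copying a full 4 TB backup drive. A minimal sketch, assuming mid-range speeds from the figures above:

```python
# Back-of-envelope copy times for a 4 TB drive, at typical effective
# speeds (not measured values): 120 MB/s for an HDD, 400 MB/s for an SSD.

def hours_to_copy(size_gb: float, speed_mb_s: float) -> float:
    """Hours needed to copy size_gb gigabytes at speed_mb_s MB/s."""
    seconds = size_gb * 1000 / speed_mb_s
    return seconds / 3600

print(f"HDD at 120 MB/s: {hours_to_copy(4000, 120):.1f} h")  # roughly 9 h
print(f"SSD at 400 MB/s: {hours_to_copy(4000, 400):.1f} h")  # under 3 h
```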

Both SSDs and HDDs have their advantages and disadvantages when it comes to life span.

While SSDs have no moving parts, they don’t necessarily last longer. Most SSD manufacturers use non-volatile NAND flash memory in the construction of their SSDs. These are cheaper than comparable DRAM units, and retain data even in the absence of electrical power. However, NAND cells degrade with every write (referred to as a program, in technical circles). An SSD exposed to fewer writes will last longer than an SSD with more. If a specific block were written to and erased repeatedly, that block would wear out before other blocks used less extensively, prematurely ending the SSD’s life. For this reason, SSD controllers use wear levelling to distribute writes as evenly as possible. This fact was brought home yesterday, with an attempt to install Linux Mint from a memory stick on a new laptop. It turned out that some areas of the memory stick were worn out, and the device could not be read as a boot drive. Almost our entire collection of memory sticks will be reformatted, and then recycled, a polite term for trashed!
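
Wear levelling itself is conceptually simple: direct each new write to the least-worn block, so no single block wears out prematurely. The toy model below illustrates the idea; real SSD controllers are far more sophisticated, and the class and names here are invented for illustration.

```python
# A toy wear-levelling scheme: each write goes to the block with the
# fewest erases so far, so wear spreads evenly across all blocks.

class WearLevelledDevice:
    def __init__(self, blocks: int, pe_limit: int):
        self.erase_counts = [0] * blocks
        self.pe_limit = pe_limit  # program/erase cycles a block survives

    def write(self) -> int:
        """Write to the least-worn block; return its index."""
        block = self.erase_counts.index(min(self.erase_counts))
        self.erase_counts[block] += 1
        if self.erase_counts[block] > self.pe_limit:
            raise IOError(f"block {block} worn out")
        return block

dev = WearLevelledDevice(blocks=4, pe_limit=1000)
for _ in range(100):
    dev.write()
print(dev.erase_counts)   # wear spreads evenly: [25, 25, 25, 25]
```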

Flash memory was invented in 1980, and commercialized by Toshiba in 1987. SanDisk (then SunDisk) patented a flash-memory based SSD in 1989, and started shipping products in 1991. SSDs come in several different varieties. Triple Level Cells (TLC) = 3-bit cells offering 8 states, with between 500 and 2 000 program/ erase (PE) cycles, are currently the most common variety. Quad Level Cells (QLC) = 4-bit cells offering 16 states, with between 300 and 1 000 PE cycles, are starting to come onto the market. However, there are also Single Level Cells (SLC) = 1-bit cells offering 2 states, with up to 100 000 PE cycles, and Multi-Level Cells (MLC) = 2-bit cells offering 4 states, with up to 3 000 PE cycles. More bits per cell results in reduced speed and durability, but larger storage capacity.
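
A crude endurance estimate follows from these PE-cycle figures: total bytes writable ≈ capacity × PE cycles. This deliberately ignores write amplification and over-provisioning, which matter considerably in practice, so treat the numbers as upper bounds only.

```python
# Naive endurance estimate for a 1 TB drive of each cell type,
# using the upper PE-cycle figures quoted above.

def endurance_tbw(capacity_tb: float, pe_cycles: int) -> float:
    """Naive total terabytes written (TBW) before the NAND wears out."""
    return capacity_tb * pe_cycles

for name, cycles in [("SLC", 100_000), ("MLC", 3_000),
                     ("TLC", 2_000), ("QLC", 1_000)]:
    print(f"1 TB {name}: up to {endurance_tbw(1, cycles):,.0f} TBW")
```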

QLC vs TLC Comparisons:

Samsung 860 EVO SSDs use TLCs, while Samsung 860 QVO SSDs use QLCs. The 1TB price is NOK 1 645 (EVO) vs 1 253 (QVO), almost a 25% price discount. The EVO offers a 5-year or 600 TB written (TBW) limited warranty, vs the QVO’s 3-year or 360 TBW.

With real-world durability of the QVO at only 60% of the EVO, the EVO offers greater value for money.

It should also be pointed out that both the EVO and QVO have a 42 GB cache that allows for exceptionally fast writes up to that limit, but they slow down considerably once that limit has been reached.

In contrast to SSDs, HDDs rely on moving parts for the drive to function. Moving parts include one or more platters, a spindle, a read/ write head, an actuator arm, an actuator axis and an actuator. Because of this, an SSD is probably more reliable than an HDD. Yet, HDD data recovery is better, if it is ever needed. Several different data recovery technologies are available.

The Conclusion:

The upcoming purchases of two My Passport 4TB external HDDs may be my last, before going over to SSDs for backup purposes, both on internal as well as external drives. Much will depend on the relative cost of 10TB SSDs vs HDDs in 2023, when it will be time to replace the Toshiba N300 10TB HDDs.

For further information on EVOs and QVOs see Explaining Computers: QLC vs TLC SSDs; Samsung QVO and EVO.

Cut/Copy and Paste

The most influential computer ever made: the original Xerox Alto, featuring a bit-mapped black and white display sized 606×808 (the same dimensions as a regular 8.5″×11″ sheet of paper, aligned vertically), a 5.8 MHz CPU, 128 kB of memory (at a cost of $4 000), a 2.5 MB removable cartridge hard drive, a three-button mouse, a 64-key keyboard and a 5-finger key set. It was on such a machine that Bravo and Gypsy were developed, and cut/copy and paste invented. (Photo: Xerox PARC)

Larry Tesler (1945 – ) invented cut/copy and paste. Between 1973 and 1976, Tesler worked at Xerox PARC (Palo Alto Research Center), in Palo Alto, California, on the programming language Smalltalk-76, and especially the Gypsy text editor, referred to then as a document preparation system. It was on this project that he implemented a method of capturing text and inserting it elsewhere.

Xerox PARC was initiated by Xerox Chief Scientist Jacob E. “Jack” Goldman (1921 – 2011), who previously worked at Carnegie Tech and directed the Ford Scientific Laboratory. He hired a physicist, George Pake (1924 – 2004), to create it in 1970.

Xerox PARC was largely responsible for developing laser printing, the Ethernet, the modern personal computer, the graphical user interface (GUI) and desktop paradigm, object-oriented programming, ubiquitous computing, electronic paper, amorphous silicon (a-Si) applications, and advancing very-large-scale integration (VLSI) for semiconductors.

For a more complete story, see: Larry Tesler, A Personal History of Modeless Text Editing and Cut/Copy-Paste (2012)

While most people focus on the cut/copy-paste tool, the concept of modeless software had an even greater impact. A mode is a distinct setting within a computer program, in which the same user input will produce different results because of other settings. Caps lock, when pressed, puts the user’s typing into a different mode, CAPITAL LETTERS. If it is pressed a second time, the original mode will be reactivated, resulting in lower-case letters.
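
The caps-lock example can be expressed in a few lines of code. This is only a sketch of the idea of a mode, not of any real keyboard driver:

```python
# A minimal moded interface: the same key press produces a different
# character depending on whether the hidden "caps lock" mode is active.

class Keyboard:
    def __init__(self):
        self.caps_lock = False          # the hidden mode state

    def press_caps_lock(self):
        self.caps_lock = not self.caps_lock   # toggle the mode

    def press(self, key: str) -> str:
        # same input, different result, depending on mode
        return key.upper() if self.caps_lock else key.lower()

kb = Keyboard()
print(kb.press("a"))        # "a"
kb.press_caps_lock()
print(kb.press("a"))        # "A" - same key, different mode
```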

Most interface modes are discouraged because of their potential to induce errors, especially when the user is expected to remember which mode the interface is in. The situation is somewhat better if there is an on-screen state/ mode indicator, such as a change in the colour of an icon, when a mode change is made.

If the user is unaware of an interface mode, there may be an unexpected and undesired response. Mode errors can be disorienting as the user copes with a transgression of user expectations. Not all mode changes are initiated by users: they can also be initiated by the system, by previous users, or by the same user, who has since forgotten the state change. In such a situation, an operation performed with the old mode in mind will disrupt user focus as the user becomes aware of the mode change. This is especially important when a user cannot find how to restore the previous mode.

Prior to Gypsy, Butler Lampson (1943 – ), Charles Simonyi (1948 – ) and others developed Bravo at Xerox PARC in 1974. It was a modal editor where characters typed on the keyboard were usually commands to Bravo, except when in “insert” or “append” mode. Bravo used a mouse to mark text locations and to select text, but not for commands.

Although similar in capabilities to Bravo, the user interface of Gypsy was radically different. In both, a command operated on the current selection. But Bravo had modes and Gypsy didn’t. In Bravo, the effect of pressing a character key depended on the current mode, while in Gypsy, pressing a character key by itself always typed the character.

In the Wikipedia article on Gypsy, the difference between Bravo and Gypsy is illustrated by three examples:

  1. Insert In Bravo’s Command Mode, pressing “I” entered Insert Mode. In that mode, pressing character keys typed characters into a holding area (“buffer”) until the Escape key was pressed, at which time the buffer contents were inserted before the selection and the editor returned to Command Mode.
    In Gypsy, no command or buffer was needed to insert new text. The user simply selected an insertion point with the mouse and typed the new text. Each inserted character went directly into the document at the insertion point, which was automatically repositioned after the new character.
  2. Replace In Bravo, to replace existing text by new text, the user pressed “R” to enter Replace Mode. That mode was just like Insert Mode except that the buffer contents replaced the selection instead of inserting text before it.
    In Gypsy, to replace text, the user simply selected the old text and typed the new text. As soon as the user began to type, Gypsy deleted the old text and selected an insertion point in its stead.
  3. Copy In the then-current version of Bravo, the user selected the destination, pressed “I” or “R” to enter Insert or Replace Mode, selected the source (which highlighted differently from the destination), and pressed Escape to perform the copy and return to Command Mode. While in Insert or Replace Mode, the user could scroll and could select a source, but could not invoke another command, such as opening a different document. To copy text between documents was more complex.
    In Gypsy, the user could select the source text, press the “Copy” function key, select the destination text or insertion point, and press the “Paste” function key. Between Copy and Paste, the system was, as usual, not in a mode. The user could invoke other commands, such as opening a different document.
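
The Gypsy behaviour described above, where typing always replaces the current selection with no command mode involved, can be sketched as follows. The class and method names are invented for illustration:

```python
# A sketch of Gypsy-style modeless editing: a selection plus typed text
# always means "replace the selection" (an empty selection is simply an
# insertion point), with no insert/replace mode to remember.

class ModelessEditor:
    def __init__(self, text: str):
        self.text = text
        self.sel_start = 0
        self.sel_end = 0                 # empty selection = insertion point

    def select(self, start: int, end: int):
        self.sel_start, self.sel_end = start, end

    def type_text(self, typed: str):
        # Typing always replaces the selection; no mode change needed.
        self.text = (self.text[:self.sel_start] + typed
                     + self.text[self.sel_end:])
        self.sel_start = self.sel_end = self.sel_start + len(typed)

ed = ModelessEditor("Hello, world!")
ed.select(7, 12)            # select "world"
ed.type_text("Gypsy")       # replaces it directly
print(ed.text)              # Hello, Gypsy!
```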

Fewer modes meant less user confusion about what mode the system was in and therefore what effect a particular key press would have. Gypsy and Bravo both used a three-button mouse, where the second and third buttons were intended for experts.

New users could learn to work with Gypsy in only a few hours. Drag-through selection, double-click and cut-copy-paste were quickly adopted elsewhere, and have become standard on most text editors.

This text was originally written in June 2009 as a draft for a weblog post. It was removed from the weblog, but subsequently revived without the original date and time stamps. New text was added at irregular intervals, including 13 May 2016, 23 April 2018, and 06 May 2019. The publication date of this weblog post celebrates the 10th anniversary of this weblog.

The Charm of the Demoscene

A Commodore Amiga 2000 with 3.5 inch floppy drive, 20 MB hard drive, keyboard and mouse. A cathode ray tube (CRT) monitor is missing. (Photo: Trafalgarcircle)

Imagine home computing in the late 1970s. Machines are weak. Software is unrefined. Popular models include Apple II and its clones, ZX Spectrum, Commodore 64 and Amstrad CPC. The IBM PC, and its clones, have not yet arrived.

I remember a friend showing off his Apple II. It would show a line of text, Name? followed by a blinking cursor. When I typed in my name, and pressed return, it would respond by writing: Hello, Brock! It was easy to be impressed by technology in the late 1970s.

Inspiration for today’s demoscene first came in 1980, when Atari used a looping demo with visual effects and music to show off the features of the Atari 400/800 computers.

Demoscene is a type of computer art that will be described in more detail, and in chronological order, later in this post. It has a darker past, but a lighter present. Many of the terms used will be defined along the way. It is an artform that generally avoids mainstream exposure. According to some sources, about 10 000 people are involved with it.

Cracker = a programmer who alters video game code to remove copy protection. Cracking crew is used where more than one person is involved in the cracking process.

Cractro = (crack intro) an introductory screen used by a cracker/ cracking crew to claim credit for cracking a game. They became very complex: a medium to demonstrate superior programming skills, advertise BBSes, greet friends, snub rivals and gain recognition.

More important in Europe than in other parts of the world, the cractro transmutes into the demo. A cracker community emerges, then evolves into an entity independent of gaming and software sharing.

New machines are better suited to support the scene, most specifically the Commodore Amiga and the Atari ST. Some IBM clones are acceptable, if they have sound cards. Not the Apple Macintosh.

More inspiration came in 1985 when Atari demonstrated its latest 8-bit computers with a demo that alternated between a 3D walking robot and a flying spaceship.

That same year, Amiga released a signature demo showing the hardware capability of its Amiga machine, with a large, spinning, checkered ball that cast a translucent shadow.

Demo = a self-contained, originally extremely small, computer program that produces an audio-visual presentation. Its purpose is to demonstrate the programming, visual art and musical skill of its producer.

Demoparty = a festival where demos are produced, after a day- or weekend-long coding marathon, then presented, voted on by attendees, and released, originally on floppy disks and bulletin board systems (BBS).

Compo = a demoparty competition, traditionally divided into categories where submissions must adhere to certain restrictions: production on a specific type of computer, or a maximum data size. Submissions are almost always rendered in real time. This contrasts with animated movies, which simply record the result of a long and intensive rendering. The purpose of a compo is to push computing hardware to its limits.

Demoscene = computer art subculture focused on producing demos, international in scope.

Demoscener = a computer artist focused on technically challenging aesthetics, but with a final product that is visually and aurally pleasing.

Demogroup = a small, tightly-knit group of demosceners, centered around a coder/ programmer, a musician and a graphician. Some groups may have supporting roles and grow to tens of people, but this is the exception. Demogroups always have names. Individuals within the group have unique handles for self-expression. Demogroups use wordmarks, logos, catchphrases and slogans. They are skilled at public relations and even human resource management. The demogroup is undoubtedly the most important social unit in the demoscene.

While belonging to a group is often synonymous with being a demoscener, there are individual productions. Not infrequently, such an individual will adopt a group name. There are also fake groups, involving secret identities, for making humorous, political or vulgar productions without harming the reputation of the original group. Individuals invent new handles, or pseudo-pseudonyms.

There used to be an American demoscene, but it barely exists today. Who killed the American demoscene? The simple answer is the American crackdown on software piracy. European copyright law, by contrast, only criminalized for-profit breaches. In many European countries, including the Netherlands, Greece, Finland, Sweden and Norway, it was possible for a cracker to repent and transform into a law-abiding demoscener.

The Amiga 2000

Our first family computer was a Commodore Amiga 1000, on loan to us while we waited for our Amiga 2000 to arrive, which it did some weeks later. In 1986/ 7, these were the best residential computers money could buy. If I remember correctly, the Amiga 2000 cost NOK 19 000 (a little over US$ 2 000 then, or about US$ 4 000 in 2019).

We bought the Amiga while living in Bodø, in Northern Norway. The company that sold it consisted of two young male idealists, who were among the most active Amiga enthusiasts in the country. In addition to selling machines, they developed software and also published a Norwegian language Amiga magazine. Some of my work appeared there. They had the largest collection of 3.5 inch Amiga floppy disks in Norway, which contained software and content on every conceivable topic. They made cractros.

The Amiga 2000 was an advanced machine. Some even claimed at the time that it would last into the 21st century. In contrast to the Amiga 1000, it allowed expansion cards to be added internally: SCSI host adapters, memory cards, CPU cards, network cards, graphics cards, serial port cards and PC compatibility cards were available. We used a SCSI adapter with a hard drive, and a PC card that allowed us to run both Amiga and PC-DOS programs. The Amiga 2000 had five Zorro II card slots; the motherboard also had four PC ISA slots, two of which were inline with Zorro II slots for use with the A2088 bridgeboard, which provided IBM PC XT compatibility.

There were about 4 850 000 Amiga machines of all types sold. The machines were most popular in the United Kingdom and Germany, with about 1.5 million sold in each country. Sales in the high hundreds of thousands were made in other European nations. The machine was less popular in North America, where only about 700 000 were sold.

Beer

I need read no further than the first word of an announcement for FOSDEM (Free and Open Source Developers’ European Meeting) to see that it is not an event for me. It is not that I fear that my non-drinking won’t be tolerated. Rather, I don’t want to have to put up with inebriated or half inebriated people.

Far too often in my working life, I have had to attend events populated by people who didn’t know the limits of propriety, after consuming alcohol.

Even though I share many of the aspirations of FOSDEM, this ad clearly demonstrates that this non-commercial, volunteer-based free and open-source software development community, is not sufficiently mature to warrant my attention, at least at this annual event held at the Université Libre de Bruxelles since 2000.

Alcohol use is a health concern, and with so much abuse in society it should not be promoted.

Good Enough Websites

Websites have many uses. Perhaps two of the more important involve the sharing of information: through emails, and through web logging, or blogging. The first email was sent by Ray Tomlinson (1941 – 2016) in 1971. He is quoted as saying that these first “test messages were entirely forgettable and I have, therefore, forgotten them.” The first web log post is 25 years old this week. It was published on 1994-01-27 by Justin Hall. Here is a link to it:

links.net/vita/web/start/original

Statistics are hard to come by, but it seems at least half a billion people have their own web logs. On 2017-09-14, one blogger reported 440 million blogs just on Tumblr, Squarespace and WordPress. Most of these people, including myself, have neither the interest nor the skills to set up a website that follows best practices. I am not sure that they even want to; I don’t. Instead, they want something that is simple but good enough for the needs of themselves and their families.

Why stress to obtain the best, when good enough will do? Patmos, Farm vehicle near Chora, Greece. Photo: Brock McLellan/ Digitization: Patricia McLellan. 1979-07-27.

Having gained some experience through work with Moodle, a Learning Management System, and being dissatisfied with a couple of web hosting providers that were supposed to support this product, I opted for one.com as a host, on the advice of someone I trust. The first lesson, then, is to ask for help from someone who 1) has experience with family-oriented websites, 2) is trustworthy, and 3) knows you, your family and your situation.

The proposed solution, which was implemented, may not be the world’s best hosting service, but it is certainly adequate, inexpensive and good enough for my family’s purposes. No issues have arisen during the past year that make me want to change vendor.

We purchased, or more correctly rent on an annual basis, a domain based on our family name, which provides email addresses for members of our family, but they are not in active use by everyone. We also paid for “Starter” web hosting services on a server.

Like most web hosts, our provider tries to make customers feel that they are getting a lot, or at least something, for their $3 a month in hosting fees. They try to impress with a content list that includes ten items: Unlimited bandwidth; Email on your own domain; Unlimited email accounts; Unlimited email aliases; Spam & Virus Protection; Fully featured professional webmail; Individual Calendar & Address Book per email account; IMAP/POP; Single domain; and, 25 GB SSD Storage. While I have a theoretical interest in some of these services, including spam and virus protection, the main product being purchased is storage space. This storage is being used for emails, as well as web logs.

Now, a second domain name has been purchased/ rented for a family member with a different surname. This has necessitated an upgrade to a “Professional Plus” web hosting service, offering hosting of multiple domains, eight times more storage (200 GB) as well as Backup & Restore facilities. Above this there is a “Business” level that offers 500 GB of storage, as the only significant difference.

Sometimes an upgrade is necessary. But that does not mean it has to be expensive. Honda Van, Exeter, England. Photo: Brock McLellan/ Digitization: Patricia McLellan. 1979-06-21.

An aside: Wouldn’t it be wonderful if names didn’t have to have elitist attributes? Why not name products after winds? The breeze, the gale and the hurricane. Or birds? The crake, the coot and the crane. I would even accept the apprentice, the journeyman and the master, or even a simple level 1, 2 and 3.

The needs of most families are relatively simple. They are not running businesses that need complex e-commerce solutions, with marketing and sales support, traffic management and guaranteed up-time. Everyone finds downtime detrimental, but it is something that can be lived with.

So, one of the first questions to ask is: Why not just use Gmail/ Hotmail/ Outlook/ Yahoo? Yes, some of these offer lots of storage space, spam and virus protection, and much more. Google offers 15 GB of storage for each user, Yahoo offers 1 TB. Personally, I am not using more than 10 GB, for email and web log, and other family members are using considerably less.

Similarly, one can ask: Why not just use Facebook to post information/ opinions that would otherwise end up in a web log?

The main reason to avoid multi-national corporations, is to protect families from the effects of long-term exploitation. These corporations are mining data and monetizing it. Yes, that is a big word, and it means they are making money off of your data. In the long-term, this will make you and your family poorer and less secure, while the elites grow richer. By using your own website, you will prevent these corporations from accessing the data they need to manipulate consumers and voters. These corporations, and a few others, are instruments effectively used by an elite, to consolidate their power.

Another important reason for having a family domain is for blogging. Roger McNamee, an early investor in Facebook, has written that information and disinformation look the same; the only difference is that disinformation generates more revenue, so it gets better treatment, at Facebook or Google. He claims that there is no way for these giants to avoid influencing the lives of users and the future of nations. Recent history suggests that this threat to democracy is real. McNamee proposes fundamental changes to their business models to reduce the harm they cause to democracy.

The rest of humanity cannot wait for these enterprise Titanics to turn in an attempt to avoid icebergs of dictatorship and oppression. People must take control of their own lives back again, to the degree that this is possible. This means reducing our presence on Google, Facebook and Twitter, and increasing our presence on our own personal websites.

Blog is short for weblog, an online journal or informational website displaying information in posts, generally read in reverse chronological order. Some blogging platforms are run by the Titan(ic)s. Blogger (whose sites are hosted on the blogspot.com domain) is owned by Google. Tumblr is owned by Verizon. Instagram is not so much a blog as a photo- and video-sharing social networking service owned by Facebook. Two open-source platforms are WordPress and Joomla.

Joomla is powerful and flexible enough to build any kind of website or blog. There are enough templates to choose from to customize any site, and extensions add more features. Yet, because Joomla has a smaller user base than WordPress, there are fewer themes and add-ons, and less support. Backups, updates and security take more work.

WordPress provides sufficient control over a website, and allows one to add extra features like forums or an online store, if that is the direction of travel. Website management can have its challenges, especially backups and security. Despite some imperfections, WordPress is the platform used on Brock at Cliff Cottage. Personally, I see no advantage in throwing away my insights into this platform just to select a different one that will require more time to learn.

Video bloggers are called vloggers. Many choose to upload their videos to YouTube, another Google-owned site. There, one creates a free and simple vlogging channel, with an existing audience close by. Other websites for video content include Vimeo and Veoh. However, there is nothing to prevent vloggers from hosting videos on their own sites. This, in fact, is in the spirit of this weblog post. WordPress offers several plugins especially designed for vlogging.
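Even without a plugin, hosting one's own video on a WordPress site can be as simple as uploading the file to the media library and embedding it in a post with WordPress's built-in video shortcode. A minimal sketch (the domain and file name below are only an illustration, not a real address):

```
[video src="https://example.com/wp-content/uploads/garden-tour.mp4" width="640" height="360"]
```

WordPress then renders its standard video player around the file, so visitors can watch the video without any third-party service being involved.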

Some web logs are very specialized: auto repair, cooking, fashion, music, Norse mythology and robotics come to mind. In addition to vloggers, there are podcasters, who make web logs featuring audio tracks. Some people create portfolios of their work. Others just want a place to display their photographs, or their paintings/ drawings/ etchings. It is all up to the individual. Artists and artisans may want to upgrade a website for business purposes, including the display and sale of merchandise. It is relatively easy to build out WordPress with plugins to accommodate new needs.

WordPress can be extended with plugins into a website for e-commerce. Three-wheeled electric milk van, Exeter. Photo: Brock McLellan/ Digitization: Patricia McLellan. 1979-06-22.

There are several WordPress books for beginners. The one I prefer is Michal Bradek's WordPress Guide for Beginners (2017). The only challenge with this book is that it is based on WordPress version 4.8. Version 5.0 was released 2018-11-19, and is becoming standard. The greatest change with this update is the Gutenberg editor, which is actually easier to use than the previous "classic" editor, but is different, so some skills have to be unlearned, and others learned.

V2: Minor corrections made 2019-01-24 19:47