The terms disability/ handicap/ impairment/ incapacity/ infirmity/ invalidity and more all refer to an inability to do something. Some of these words show greater sensitivity to the person involved than others. There are many things no one can do, such as turning one’s head 360 or even 180 degrees. This is not a disability, because it is not in the human repertoire. There is a developmental curve in which infants transform themselves into adults: a child’s inability to do something is not a disability. At the other end of the age spectrum, people lose some of the abilities they once had; they are genetically programmed to develop disabilities. Yet there are some abilities that most people in a given age group have, but a few lack. Most people can distinguish red from green, but some are colour-blind.
Impairments may be congenital or originate from disease/ trauma. They can be placed in various categories: cognitive impairments (head injury, autism, developmental disabilities) and learning disabilities (dyslexia, dyscalculia, ADHD); visual impairments (low vision, complete or partial blindness, colour blindness); hearing impairments (deafness, reduced hearing, hyperacusis); and motor/ dexterity impairments (paralysis, cerebral palsy, dyspraxia, carpal tunnel syndrome, repetitive strain injury).
One interesting place to begin an exploration of this topic is with the 84 minute long, 1991 Japanese animated science fiction action comedy thriller, Roujin Z (Old Man Z). This should provide an opportunity for people to decide what they want to avoid in terms of technology. According to British film commentator and sometime Vancouver International Film Festival programmer Tony Rayns (1948 – ), the film focuses on three primary issues: health care for the elderly, the stand-off between traditional values and modern technology and the Right’s covert plans to re-militarise Japan. (Rayns, Tony (1994). “Rojin Z/Roujin Z” in Sight & Sound. Vol. 4 no. 7. British Film Institute. pp. 52–53.) Only the third topic is uninteresting in terms of this post’s content.
Access, assistive and adaptive technology are three levels of technology. Accessible technology usually refers to specialized but low-level hardware or software features (or both) that help someone mitigate an impairment. It most often involves computing equipment that can be used by anyone, regardless of impairment type or severity.
Accessibility features such as text-to-speech, closed captioning and keyboard shortcuts make computer technology less challenging to use for those with impairments. They also benefit those who are not sufficiently impaired to be considered disabled. Basic tools are often provided by the operating system being used.
Visual aids include high-contrast text, large text and screen readers, in addition to desktop zoom features. In terms of keyboards, some people may prefer to use on-screen keyboards, or to have visual or audio indicators for the caps and numerical locks. Visual and audio alerts can also be used to provide event feedback. Typing assistance may use sticky keys, which treat a sequence of modifier keys as a combination; slow keys, where there is a delay between when a key is pressed and when it is accepted; and bounce keys, which ignore fast duplicate keypresses. For mouse keys, a keypad can be used to control the pointer; a secondary click can be simulated by holding down the primary key; and a click can be triggered when the pointer hovers.
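The timing rules behind slow keys and bounce keys can be sketched in a few lines of Python. This is an illustrative model only: the thresholds, the event format and the function names are assumptions, not any operating system’s actual implementation.

```python
# Illustrative model of two keyboard-accessibility filters.
# Events are (key, press_time, release_time) tuples; times in seconds.

def slow_keys(events, delay=0.5):
    """Accept a key only if it is held down for at least `delay` seconds."""
    return [key for key, down, up in events if up - down >= delay]

def bounce_keys(events, window=0.3):
    """Ignore a repeated press of the same key arriving within
    `window` seconds of the previously accepted press."""
    accepted, last_press = [], {}
    for key, down, _up in events:
        if key not in last_press or down - last_press[key] >= window:
            accepted.append(key)
            last_press[key] = down
    return accepted

events = [("a", 0.0, 0.6), ("a", 0.1, 0.15), ("b", 1.0, 1.8)]
print(slow_keys(events))    # the brief second "a" tap is filtered out
print(bounce_keys(events))  # the duplicate "a" within 0.3 s is ignored
```

Real implementations work on live event streams rather than lists, but the acceptance logic is the same.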
In addition to operating system features, there are many different apps available for both computers and handheld devices. eSpeak is a compact open source software speech synthesizer for English and other languages, for Mac, Linux and Windows. Formant synthesis allows it to work in many languages despite its small size. The speech is clear, and can be used at high speeds, but it is not natural or smooth, in contrast to larger synthesizers that rely on recordings of human speech.
Assistive refers to more complex but standardized/ off the shelf objects or systems that are used to help individuals cope with their impairments.
Systems can remind people to turn off an oven or stove burners, close the blinds and lock doors. While users can choose which events they are to be reminded of, many systems can be expanded to perform these tasks automatically, to turn lights on or off, or to adjust room temperatures. Reminder systems can be based on a handheld or wrist-worn device, and can also remind people about doctor’s appointments and taking medications.
Another basic device is the remote keyless entry system, which allows people to view who is at the door and then, should they choose, remotely open it. The system can also lock doors and shut blinds to maintain privacy.
Automated pill dispensers can dispense only the pills that are to be taken at a given time. Monitors are available to check a person’s blood sugar level, body temperature, blood pressure and pulse, and to dispense appropriate medications as needed, or to alert others. Robots can provide patients with medication and/ or nutrition.
Adaptive technology is specifically designed for people with disabilities. Sometimes, adaptive technology is regarded as a subset of assistive technology, referring specifically to electronic and information technology that are seldom used by a non-disabled person.
There seem to be two categories of space exploration corporations: winners like SpaceX, and those unable to win, like Boeing. For the past few years, Boeing’s reputation has been slipping, in large part because of its inability to manage, design and manufacture high-quality technological products. This was examined in a previous post, Clowns Supervised by Monkeys, about the Boeing 737-MAX, and continues here, about the CST-100 Starliner.
The Commercial Crew Development (CCDev) program is a human spaceflight program funded by the American government and administered by the National Aeronautics and Space Administration (NASA). The goal of CCDev is to fly US and international astronauts to the International Space Station (ISS) on privately operated crew vehicles.
The CCDev program started in 2010, with CCDev 1 providing $50 million to five US companies to develop human spaceflight concepts and technologies. This was followed up in 2011, with CCDev 2 providing $270 million to four companies for developing vehicles that could fly astronauts after the Space Shuttle fleet’s retirement. In 2014 operational contracts to fly astronauts were awarded to SpaceX and Boeing.
CCDev 3 was renamed Commercial Crew integrated Capability (CCiCap). This involved proposals for end-to-end operational concepts including spacecraft, launch vehicles, launch services, ground and mission operations, and recovery. A final request for proposals was released on 2012-02-07 to be submitted by 2012-03-23. Three proposals were selected, and announced 2012-08-03: Sierra Nevada Corporation was awarded $212.5 million for its Dream Chaser/ Atlas V proposal; SpaceX was awarded $440 million for its Dragon 2/ Falcon 9 proposal; and, Boeing was awarded $460 million for its CST-100/ Atlas V proposal.
Certification Products Contract, phase 1 (CPC 1) involved the development of a certification plan with engineering standards, tests, and analyses. Sierra Nevada, SpaceX and Boeing were each awarded about $10 million.
CPC 2 was renamed the Commercial Crew Transportation Capability (CCtCap) and included the final development, testing and verifications to allow crewed demonstration flights to the ISS.
On 2014-09-16, Boeing and SpaceX received contracts to provide crewed launch services to the ISS. Boeing received a potential $4.2 billion, and SpaceX up to $2.6 billion.
On 2019-11-14, NASA’s inspector general reported a seat price of $90 million for Starliner and $55 million for Dragon Crew. Boeing’s price exceeds the $80 million paid by NASA to the Russian space corporation, Roscosmos, for Soyuz spacecraft seats to fly astronauts to the ISS. The report also stated that NASA agreed to pay an additional $287.2 million above Boeing’s fixed prices. Similar compensation was not offered to SpaceX.
While the first CCDev flight was planned for 2015, insufficient funding and technical issues caused delays.
Dragon 2 Pad abort test
Dragon 2 Uncrewed orbital flight test
CST-100 Pad abort test
CST-100 Uncrewed orbital flight test
Dragon 2 In-flight abort test
Dragon 2 Crewed test flight
CST-100 Crewed test flight
The CCDev program aimed to minimize development costs through private investment and development, by having two space transportation vehicles compete with each other. NASA hoped this approach would provide redundancy in regards to both development and flight operations.
After completing the demonstration flights, each company is contracted to supply six flights to ISS between 2019 and 2024.
As shown in the list above, SpaceX has successfully tested its Dragon 2 (Crew Dragon), despite a delay caused by a ground test failure traced to a leaky valve.
On 2019-12-20 an uncrewed Boeing CST-100 Starliner space taxi malfunctioned on the capsule’s first mission, an Orbital Flight Test (OFT). The initial failure was due to a timer fault. Another error has since been found in the capsule’s software which could have destroyed the Starliner. Had this second fault not been discovered, it could have resulted in the deaths of the astronauts onboard a crewed Starliner.
The CST-100 has had greater problems. Its abort test, while successful, still had a parachute fail during descent. Its service module leaked toxic fuel, delaying its uncrewed OFT by months. The OFT was supposed to be one of the last steps in Boeing’s development of the CST-100 Starliner. When that test finally happened, a timer failure prevented a rendezvous with the ISS. It failed its mission. Then it was discovered that there were other issues including inappropriate thruster firings, inappropriate valve mappings, potential collision issues between the service module and the crew module, as well as space-to-ground communication issues.
Doug Loverro, the head of NASA’s human spaceflight section, stated that the software anomalies were “likely only symptoms…we had numerous process escapes in the design, development, [and] test cycle for software…We have a more fundamental problem…”
Boeing is failing in its ability to deliver mission-critical software, not only in spacecraft but also in aircraft, as shown by the disastrous failures of the Maneuvering Characteristics Augmentation System (MCAS) software on the Boeing 737 MAX.
NASA administrator Jim Bridenstine held a media teleconference detailing some of the CST-100 issues, explaining that this was in the “interest of transparency”, given that the OFT had “lots of anomalies”.
Given a choice of being an astronaut with SpaceX or Boeing, there is no doubt that every rational person would opt for SpaceX. Boeing is just too dangerous a company.
Handheld devices and laptops do not need attached input and output peripherals to operate. This is one attribute that makes them extremely portable and popular. Yet neither portability nor popularity, in themselves, makes them intrinsically better than less portable devices. Each device has to be examined in the context of its intended uses. Unfortunately, many portable devices make few concessions to ergonomics.
This weblog post looks at peripherals that provide a better working situation for most people using a desktop device. Often, the same peripherals can be plugged into a laptop or a tablet. Advice on software to improve access or provide assistance will be provided next week (2020-03-10). Advice for people with specific impairments (vision, hearing, dexterity and/ or mobility) will be given in four weblog posts scheduled from 2020-11-24 to 2020-12-15.
The influence of the typewriter on the development of the computer keyboard cannot be overestimated. Yet finding the typewriter’s original inventor is difficult. Journalist Joan Acocella (1945 – ) once estimated that significant contributions have been made to its development at least 52 times through history. However, the name typewriter began to be used in 1873, with the production of a Remington machine with a QWERTY keyboard.
Yet, the typewriter is not the only source of keyboard inspiration. The 1846 invention of the teleprinter/ teletype/ TTY by Royal Earl House (1814 – 1895) was also important, even if the 28 keys looked as if they belonged on a piano. A more typewriter-like keyboard for a teleprinter was invented in 1901 by Donald Murray (1865–1945). That same year Herman Hollerith (1860 – 1929) invented a keypunch, a machine that punches holes in paper.
The QWERTY layout was used because it was inefficient, making the machine less susceptible to jamming. There are keyboard formats that are faster. In 1936 Seattle educational psychologist August Dvorak (1894 – 1975) and William Dealey (? – ? ) patented the Dvorak keyboard, claiming it required less finger motion to write English, reduced errors and repetitive strain injuries, increased typing speed and made typing more comfortable. People interested in learning why Dvorak is the better system are encouraged to read the Wikipedia article.
Part of the challenge of finding ergonomic keyboards is that some people may need additional features that only come with software support. If that supporting software is not available for a specific host device, such users may not be able to use the keyboard optimally, although in most cases the keyboard will still work. For example, at the moment I use a Logitech K380 keyboard ($40) with my desktop machine, rather than a more ergonomic but more expensive ($130) Ergo K860 keyboard or the cheaper ($60) Ergo K350 model. I have found that the K380 fits my hands optimally.
Other people should examine the Ergo K860 to decide which elements of the keyboard, if any, they need. It should be noted that there are many other manufacturers of ergonomic keyboards than Logitech.
While ergonomics is important, it is not the only consideration in a keyboard. Despite dust control systems, a workshop can be a dusty place. Thus a less ergonomic, but waterproof and dustproof Logitech Keys-to-Go Bluetooth keyboard has been purchased for use in this environment.
The mouse has its origins in a trackball invented in 1946 by Ralph Benjamin (1922 – 2019). The first pointing devices resembling today’s mice include a 1963 design study by Douglas Engelbart (1925 – 2013), and its implementation as a prototype in 1964 by Bill English (1938 – ). The 1973 Xerox Alto is regarded as the first modern computer to utilize a mouse.
The main challenge with using a computer mouse is the position of the hand(s). The natural position of the hand is closer to vertical, as if holding a saw, than horizontal. Logitech claims the ideal angle is 57°. When the hand is forced to turn further over (counter-clockwise for the right hand), mouse operation becomes less comfortable and can provoke carpal tunnel syndrome and other repetitive strain injuries. Its use can become extremely uncomfortable for people with arthritis. Thus, it is important to design a mouse to fit natural hand positions and movements.
There are a lot of ergonomic mice available. However, one of the best, suitable for a wide range of users, is the Logitech MX Vertical. Logitech claims it is effective in reducing muscle strain and hand movement. It is made from rubber and aluminum. It can be used wired, with a USB-C cable, or wirelessly, using Bluetooth or a Unifying USB-A dongle. One of the real challenges is finding an ergonomic mouse for someone who is left-handed.
There are times when a mouse is the wrong pointing device to use to input data. This is especially true when trying to work with graphics. While there are many products available, One by Wacom is inexpensive and simple to use, but not something a professional artist would aspire to. It connects to a computer with a USB cable. The active area is about 152 x 95 mm. Even though Wacom advises users to download drivers to PC and Mac computers, these were already installed on Linux Mint. Its pen is ergonomic, lightweight (no batteries), balanced, pressure-sensitive and comfortable to hold. It allows people to sketch/ draw/ paint/ edit graphic works, including photos.
Speakers, headphones and earbuds are often tethered to a host device. A few use a 1/4 inch phone connector, dating from 1877. More often a 3.5 mm connector is used. These date from the 1950s, but were popularized in the 1964 Sony EFM-117J transistor radio. At one time they were universally provided in tip-ring-sleeve (TRS) versions offering stereo audio. USB connectors are also available.
Cordless headphones/ earphones receive a radio or infrared signal from a Bluetooth, DECT, FM or Wi-Fi transmission link. The headphone is only part of a powered receiver system.
Headphones are especially important in spaces that are shared. While primitive versions existed by 1906, the first fully functional headphones were invented by Nathaniel Baldwin (1878 – 1961) in 1910, and sold to the US Navy. John C. Koss (1930 – ) invented stereo headphones in 1958. Earphones aka earbuds have existed since at least 1984. They too are increasingly cordless. However, because of the small size of their batteries, they often run out of battery power, if used for long periods.
At this point it should be pointed out that open office landscapes typically create un-wellness. To understand the full implications as to why open offices are such a bad idea, people are encouraged to read The Open-Office Trap.
Note: A more extensive work on headphones is planned for 2021.
The most sensitive and least adaptable peripheral is the display/ monitor/ screen. With a relatively shallow desk, an ideal screen size is probably 27″ (70 cm) or less. With a deeper desk, the size could increase to 32″ (81 cm), especially if a large number of windows have to be open simultaneously. In-Plane Switching (IPS) technology should be used to provide accurate colour and a wide (up to 178°) viewing angle. It should be easy to height-adjust, tilt, pivot and swivel the display. In addition, the display should be flicker-free and have reduced blue light, especially after dark. It can be advantageous for a display to have built-in stereo speakers. However, if the room is shared, headphones should be provided and used whenever two or more people occupy the room.
If the display cannot be positioned satisfactorily, then one must consider repositioning the desk. Ideally, a display should be placed at right angles to, or away from, windows and other light sources so it does not create/ reflect glare. Glare may cause eye strain. Once this is done, the display should be centred directly in front of the user.
Desks and chairs are usually not considered peripherals, but they are important for maintaining health, especially if one is going to be working with a device for many hours during the day. Many people find a height-adjustable desk ideal, because it allows them to stand or sit depending on their mood. Height adjustments for growing children should be made, say, two to four times a year. Adults should probably check things annually. Pregnant women may also want to make frequent adjustments.
Adjustments. Begin by sitting. First, adjust the chair seat height so that the feet are comfortable on the floor. Adjustments to the back support can be made at the same time. Second, adjust the desk height so that the hands feel comfortable on the keyboard and mouse. Note this position. Third, adjust the display/ monitor/ screen height so that it can be seen comfortably without strain. Place the top of the screen at or slightly below (0 – 30°) eye level. It should also be placed about an arm’s length away from the user, so that the entire screen can be viewed comfortably. Fourth, stand and remove the chair. Adjust the desk height once again so the hands feel comfortable using the keyboard and mouse. Note this position. The display should need no further adjustment.
Purchasing/ Repurposing/ Disposal
Many people amass numerous peripherals over time. Thus, even when a peripheral fails, it is not necessary to buy a replacement immediately. This means that it is possible to schedule purchases over a period of time. Yet, because things can be damaged, such a schedule has to be flexible, allowing for a change in priorities.
Sometimes I become very disappointed with a piece of equipment, and want to return it. One reason I purchase products from the Norwegian electronics chain, Power, more than many other suppliers is their 30 day return policy. They allow people to return any product within 30 days for a full refund, even equipment that has been used. On the other hand, I also buy returned products from them at a discounted price.
Some of the peripherals recommended here have not been purchased yet! Many are scheduled to be purchased at some time in the future. For example, my current display, a Samsung SyncMaster S27B350, was purchased 2012-11-09, so it is not even eight years old. It functions adequately, but is not height adjustable and does not have many of the other features discussed. Because I have some vision issues related to blue-light exposure, its replacement is nearing the top of the replacement schedule. It will probably be replaced in the coming year with something similar to a Benq BL2780T. The Samsung display will be repurposed; it is far too good to simply discard. It will probably find a new home in the workshop, where it will still be used, but not so extensively.
Another alternative is to give products away to people who would otherwise be unable to afford them. However, I do not use other people as recycling stations for products that have met their end of life. These are disposed of at the local recycling station. Here, there is even a safe, for the proper disposal of media containing data.
Clowns supervised by monkeys is a description of Boeing that comes from a 2017 email by one of its employees. It seemed apt after two fatal crashes of 737 MAX aircraft killed 346 people because of faulty Maneuvering Characteristics Augmentation System (MCAS) software, a flight control subsystem designed to enhance pitch stability.
In addition, there have been multiple problems with 787 Dreamliners. Some of the problems involve leaking fuel valves and lithium-ion battery problems. Most recently, in 2019-12, it was revealed that Boeing removed copper foil that formed part of the lightning strike protection from wings of the aircraft.
Additional questions are being asked after yet another 737 made a “rough landing” on 2020-02-05 at Istanbul’s Sabiha Gokcen airport. Three people were killed and 179 injured, of the 183 passengers and crew on board. Adding to this is a question about the legitimacy of the report on the 2009-02-25 crash of Turkish Airlines flight 1951.
This crash is the subject of Mayday episode 72 (aka Series 10 Episode 6) “Who’s in Control?” first shown 2011-02-28.
New York Times journalist Chris Hamby claimed in 2020-01 that the investigation either excluded or played down criticisms of the manufacturer in its 2010 final report, after pushback from Boeing and American National Transportation Safety Board (NTSB) officials. The Hamby article uses a 2009 human factors analysis by Sidney Dekker. In 2020-02, it was reported that Boeing refused to cooperate with a new Dutch review on the crash investigation and that the NTSB had refused Dutch lawmakers’ request to participate.
Then there is the KC-46 Pegasus, a military aerial refueling and strategic military transport aircraft developed from the 767 airliner. Numerous issues include its remote vision system, its refueling boom, and delivery with loose tools and other debris left inside planes after manufacture.
While not all issues are software related, several are. There seem to be significant flaws in Boeing’s software verification process. The heart of the problem is that Boeing has been given permission by the American Federal Aviation Administration (FAA) to certify its own designs. That means that Boeing regulates itself.
The verification of software takes considerable effort, and expertise. Some experts claim that it takes an order of magnitude more effort (a fancy way of saying ten times more) to verify a software program than to develop and test it. Many also conclude that it takes a special type of person, frequently someone on the autism spectrum, to undertake such work. For extroverts, and other people far removed from autism, dealing with system verifiers can be problematic.
Airbus and Boeing refuse to compete on the basis of safety. Both companies pretend that they are equally safe, and that the only metric that needs to be taken into consideration by airlines is price. Unfortunately, safety is an issue, and some inconvenient metrics demonstrate this. The Airbus A320 family of aircraft competes against the Boeing 737 family. Airsafe’s fatal crash rates per million flights put the Airbus A320 family’s rate at 0.08, in contrast to the Boeing 737 family’s rate of 0.23 (almost three times higher).
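The ratio behind that comparison is easy to check, using the Airsafe figures quoted above:

```python
# Airsafe fatal-crash rates per million flights, as quoted above.
a320_rate = 0.08
b737_rate = 0.23

ratio = b737_rate / a320_rate
print(f"737 family rate is {ratio:.2f}x the A320 family rate")  # almost three times
```

Note that these figures cover whole aircraft families across decades of service, so they mix old and new variants; they are a blunt, but still uncomfortable, comparison.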
Lou Whiteman, an analyst at the Motley Fool, wonders if Boeing should be split up. He reasons that Boeing is too large and complex to manage effectively. The result is a series of blunders. Because of the dominance of Boeing, any failures have a massive impact on the entire U.S. economy.
Beyond Boeing, one challenge facing many companies in modern business culture is the use of extroverts as executives. These often have an ability to speak for themselves, even promoting themselves as executive material. Yet an ability to listen may be, if not lacking, regarded as of secondary importance. Worse still is the situation where sociopaths and psychopaths become executives. Readers interested in the challenges posed by extroversion are encouraged to read Susan Cain’s Quiet: The Power of Introverts in a World That Can’t Stop Talking.
I’m allowing Fugboi to have the final comment originally posted as a comment in the Mentour Pilot video: “What’s wrong with Boeing? Answer: MCAS (Money Comes Above Safety)”
Computer devices are dependent on electricity to operate. Increasingly, devices use battery storage/ power, to gain a temporary independence from the electrical power network. Various forms of small scale, local energy production (solar, wind) can even lead to a more permanent independence. However, not everyone is in a position to become permanently independent from the grid.
An electrical power blackout/ cut/ failure/ outage is the loss of electrical power somewhere in the network, affecting the supply to an end user. These may be caused by faults or damage at power stations, substations or transmission lines, by short circuits, or by circuit breaker operation. A transient fault is a temporary loss of power that is automatically restored once the fault is cleared. A brownout is a drop in voltage in an electrical power supply, and can cause operational problems. A blackout is the total loss of power to an area, and may last from minutes to weeks depending on circumstances. Rolling blackouts are planned blackouts that occur when demand exceeds supply: customers are rotated, so that at any given time some receive power at the required voltage while others receive no power at all. Preventative blackouts are also used as a public safety measure, for example to prevent wildfires around poorly maintained transmission lines.
Batteries in electric vehicles as well as solar panels are ensuring that there is always a minimal amount of electric power available, even if there is a grid related blackout. Circuits have to be designed so that electricity is not fed into the grid at these times, because that power could represent a hazard to people working on the lines to restore power.
Desktop and Tower computers
There are many different form factors used to make desktop and tower computers. These motherboard specifications determine dimensions, power supply type, location of mounting holes and ports. This ensures that parts are interchangeable over time. Two are especially important. The ATX specification was created by Intel in 1995, and is still the most popular form factor. Here the power supply offers several different voltages to meet specific needs. These include +3.3 V, +5 V, -5 V, +12 V and -12 V. There were and are attempts to offer just 12 V, but this reduces the selection of components available.
This type of computer has a power supply built into the computer case. It is suitable for power intensive assignments such as gaming or video rendering. Some can provide 500 W of power. The tower format is especially attractive if a large number of expansion cards are needed. One challenge with these is the need for active cooling. Fans can be noisy. The open nature of the case means that it will attract and accumulate dust, which will have to be cleaned at regular intervals.
The Mini-ATX family of motherboard specifications was designed by AOpen in 2005 with Mobile on Desktop Technology (MoDT). This adapts mobile CPUs for lower power consumption. With a simple but effective thermal design, it is suited for passive cooling, making it virtually silent. Manufacturing costs and overall operating power requirements are lower relative to active cooling designs. A DC-to-DC converter removes the power supply from the case, reducing system size. This type of computer is suitable for general use. The number of expansion cards is limited.
Almost all modern laptops use Lithium Ion (LiIon) batteries and use 19 V chargers. Each battery arranges LiIon cells in series. A LiIon cell has a maximum charging voltage of 4.2 V, although slightly more voltage is applied in practice. Voltage needs: one cell = 4.2 V; two cells = 8.4 V; three cells = 12.6 V; four cells = 16.8 V; five cells = 21 V. A charger uses a switched mode power supply (SMPS) to convert the supply voltage to the required voltage, sometimes using a boost converter (which steps voltage up), but usually a buck converter (which steps voltage down). A 19 V buck converter could charge up to 4 cells in series. When a LiIon cell is close to fully discharged, its terminal voltage is about 3 V. A buck converter can accommodate this reduced voltage to maintain its charging efficiency, which can exceed 95 %.
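The cell arithmetic above can be written out directly. This is a minimal sketch using the 4.2 V charging ceiling, 3 V discharge floor and 19 V supply quoted in the text; the function names are my own:

```python
CELL_MAX_V = 4.2   # maximum charging voltage per LiIon cell
CELL_MIN_V = 3.0   # terminal voltage when a cell is nearly discharged
SUPPLY_V = 19.0    # typical laptop charger output

def pack_voltage_range(cells):
    """Voltage range of a pack of `cells` LiIon cells in series."""
    return cells * CELL_MIN_V, cells * CELL_MAX_V

def max_cells_for_buck(supply=SUPPLY_V):
    """Most series cells a buck (step-down) converter can charge,
    since its output voltage must stay below the supply voltage."""
    return int(supply / CELL_MAX_V)

for n in range(1, 6):
    low, high = pack_voltage_range(n)
    print(f"{n} cell(s): {low:.1f} V (empty) to {high:.1f} V (full)")

print("A 19 V buck converter can charge up to", max_cells_for_buck(), "cells")
```

This reproduces the figures in the paragraph: a 19 V buck converter tops out at four series cells (16.8 V), while five cells (21 V) would require a boost converter.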
There are three charger plugs in common use on handheld devices: micro-USB-B, USB-C and Apple Lightning. A cable will typically have a connector for one of these standards on one end, and a standard USB-A 2.0 connector on the other, that can be plugged into a charger, which – in turn – is plugged into a wall socket.
The micro-USB and lightning connectors will likely disappear in Europe, as the European Union has mandated use of a common charging connector, which will probably end up being USB-C.
USB4 & USB-C
As stated in an earlier post, USB originally allowed power to flow downstream from a Type-A connected device to a Type-B connected device. This situation has changed with the introduction of USB-C connectors, which combine A and B characteristics. It is not easy to see where power is flowing.
Types of power delivery include: Low Power USB 3.0; High Power USB 3.0; Battery Charging 1.2; Power Delivery 1.0 (micro-USB); and Power Delivery 2.0/ 3.0 (USB-C).
Among the most common devices needing power to operate are external drives. For example, Western Digital My Passport external hard drives, which feature 2.5 inch drives, are powered by the computer, using a USB Type-A connector for data and power. In contrast, Western Digital My Book external hard drives feature 3.5 inch drives, which also come with a USB Type-A connector for data, but with an AC adapter with a wall socket plug for power.
USB Battery Charging defines a charging port, which may be a charging downstream port (CDP), with data, or a dedicated charging port (DCP), without data. Dedicated charging ports on USB power adapters can run attached devices and battery packs. Charging ports on a host are labelled as such.
In the computer world, one is perpetually in a transition period. There is always something newer and potentially better, and something older but tested and, hopefully, more reliable. Thus, a Western Digital G-Drive comes with three cables: USB-C to USB-C, USB-C to USB-A and a power cable that attaches to an AC power adapter. For computers that can provide power over a USB-C connector, only the first cable is needed. For computers without USB-C power, the power cable connects the AC adapter to the G-Drive, while the second cable connects the G-Drive to the computer.
Power over Ethernet (PoE)
With domotics (smart houses) becoming increasingly popular, it is advantageous to install and use Ethernet cables to ensure devices are able to communicate effectively. While some switches only provide data transfer, many offer power as well, using one of the many Power over Ethernet standards.
There are two different approaches to providing power. Power sourcing equipment (PSE) provides power on the Ethernet cable. When the PSE is a network switch, typically the case, it is called an endspan or endpoint. The alternative is an intermediary device, a PoE injector, also referred to as a midspan device.
A powered device (PD) is any device powered by PoE. One common device is a room controller which has sensors collecting data about a room, and actuators such as a solenoid capable of opening and closing a heating vent. In the future many computing devices should be able to receive power directly from an Ethernet cable.
PoE is expected to become a DC power cabling standard, replacing individual AC adapters in a building. While some are concerned that PoE is less efficient than AC power, others argue that it is a better solution because a central PoE supply replaces several/ many AC circuits with inefficient transformers and inverters, compensating for any power loss from cabling.
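Planning a PoE installation comes down to simple power accounting. The following Python fragment is a sketch of that arithmetic: the per-standard wattages are the nominal IEEE PSE figures, while the switch, its 65 W budget, and the device loads are hypothetical examples of my own:

```python
# Nominal PSE output power per port for the common PoE standards (IEEE).
POE_STANDARDS_W = {
    "802.3af (PoE)": 15.4,
    "802.3at (PoE+)": 30.0,
    "802.3bt (Type 3)": 60.0,
    "802.3bt (Type 4)": 100.0,
}

def budget_ok(total_budget_w, device_loads_w):
    """True when the switch's total PoE budget covers every powered device."""
    return sum(device_loads_w) <= total_budget_w

# A hypothetical switch with a 65 W PoE budget running four devices:
loads = [6.5, 6.5, 12.0, 25.0]  # e.g. two sensors, a room controller, a camera
print(budget_ok(65, loads))  # True: 50 W of load against a 65 W budget
```

A real installation would also account for cable losses, which is one reason PSE figures exceed what the powered device may draw.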
Inductive charging involves wireless power transfer, using electromagnetic induction to provide electricity to portable devices. The most common application is the Qi wireless charging standard. Devices are placed near an inductive pad. There is no need for electrical contact with a plug, or for devices to be precisely aligned, for energy transfer. Advantages of wireless charging include: corrosion protection and reduced risk of electrical faults such as short circuits due to insulation failure; increased durability, because there is significantly less wear on the device plug and the attaching cable; No cables; Automatic operation.
There are some disadvantages: slower charging, as devices can take 15 percent longer to charge; greater expense, because of the drive electronics and coils needed in both device and charger; a certain inconvenience caused by the inability to move a device away from its pad while it is being charged; an assortment of incompatible standards that not all devices support; and heat, since devices get warm when charging, and continued exposure to heat can result in battery damage.
The Qi standard is supported on devices from Apple, Asus, BlackBerry, Google, HTC, Huawei, LG Electronics, Motorola Mobility, Nokia, Samsung, Sony and Xiaomi. Released in 2008, the Qi standard had by 2019 been incorporated into more than 160 handheld devices.
Uninterruptible Power Supply
An uninterruptible power supply (UPS) provides electricity to an attached device if the main supply becomes unavailable. The duration of this protection varies. This is particularly important for desktop and server devices that do not have built-in batteries, in contrast to handheld devices and laptops, which do. Even if a UPS is designed for mission-critical equipment, it can be nice to have in a residence. Perhaps the most important device to connect to a UPS is the router, to allow communication outside the residence. The local internet service provider (ISP) should have the rest of the network protected with their own UPS. What they actually provide varies.
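How long a UPS will keep a router alive is simple arithmetic: usable battery energy divided by load power. A minimal Python sketch, where the 77 Wh battery, 15 W router load, and 85 % inverter efficiency are illustrative assumptions, not figures from any particular product:

```python
def ups_runtime_minutes(battery_wh, load_w, inverter_eff=0.85):
    """Rough UPS runtime: usable battery energy divided by load power."""
    if load_w <= 0:
        raise ValueError("load must be positive")
    return battery_wh * inverter_eff / load_w * 60

# A 77 Wh UPS backing a 15 W router (both figures are assumptions):
print(round(ups_runtime_minutes(77, 15)))  # about 262 minutes
```

The estimate is optimistic: battery capacity degrades with age, and many UPS units cut out before the battery is fully drained.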
Data or voice communication with a handheld device on a cellular network can be the most effective way of communicating in an emergency situation, during a blackout. This is because cellular base stations should have their own backup power sources, allowing them to operate normally.
All electronic devices should be protected against surges = situations where voltages increase above specified levels, even if only for a fraction of a second. Surge protectors can prevent this form of damage. Such protection is probably best provided by having it built into the circuit-breaker box.
Some technical journalists are more dramatic than others. One described a display/ monitor/ screen as a window opening onto the soul of a computer, and the purchase of an appropriate one, a make or break experience. Both statements are exaggerations. Within a given budget, one must attempt to optimize the value of each component, and find a compromise.
There are several terms that will have to be understood to appreciate video connectors. Source, here, will be used generally to describe a machine producing a video signal, possibly through the internet, or stored as a file, or produced by a program. A display is a video screen capable of showing video images. A source is usually connected to a display using a cable. The ends of the cable, as well as the physical ports on the machines are connectors.
Most of the time graphic content on a display is something relatively static, including a desktop image, a text being written, or a list of files. At other times, the content may be more dynamic, such as a video film or a game.
A video connector is often overlooked because it is smaller than a display. Yet, it determines which displays can be used. This applies to laptop as well as desktop machines, home media centres, and handheld devices. There are a lot of standards associated with computer displays: Wikipedia lists 69 display standards, 22 signal standards and 32 connector standards. Fortunately, most of them can be forgotten as relics of the past.
One of the challenges with writing about video connectors is that there are several groups of intensive video users, who have specific demands. Some of these people are video editors, others are film fanatics, but most are gamers. They probably know very precisely what they want, and have other sources for advice. Intensive users will probably want to use a DisplayPort, version 2.0 on both source and display, connected using an ultra-high bit rate 10 (UHBR 10) cable. Mac users with USB-C connectors will want to connect to USB-C displays. With these extremists out of the way, the rest of this weblog will consider the needs of more ordinary users.
Some comments about specific video connectors
In computing’s Middle Ages, a Video Graphics Array (VGA) connector emerged in 1987. It was used on video cards, computer displays, laptop computers, projectors and even high definition television sets. Later, a smaller mini-VGA port was sometimes provided on laptop computers. This connector is still found on large numbers of sources and displays.
The Digital Visual Interface (DVI) was designed to replace VGA in 1999-04. It can transmit uncompressed digital video in three modes: DVI-A (analog only), DVI-D (digital only) or DVI-I (digital and analog). It is backward compatible for use with VGA displays.
High-Definition Multimedia Interface (HDMI) dates from 2002-12. It is backward compatible with DVI-D and DVI-I, but not DVI-A. There are five types of connectors in use: Type A = standard-HDMI; Type B = dual-link-HDMI (never used); Type C = mini-HDMI; Type D = micro-HDMI; and Type E = Automotive Connection System (ACS).
DisplayPort dates from 2006-05. It is backward compatible directly or indirectly, with all of the previously mentioned connectors. All DisplayPort cables are compatible with all DisplayPort devices, but may have different bandwidth certification levels (RBR, HBR, HBR2, HBR3). The major difference between DisplayPort and HDMI, is that while HDMI originated with consumer electronics, DisplayPort was oriented towards computer standards, especially at the more expensive end of the market. While there are several other DisplayPort connectors, only the Full-size DisplayPort and Mini-DisplayPort connectors will be discussed further.
What types of connectors are being used on each machine? Machines can be divided into sources and displays. Sources include different types of handheld devices (aka mobile/ cell phones, tablets), laptops, desktops and media players. Displays include monitors, televisions and projectors.
If any sources or displays support only VGA or DVI connections, these units should be considered for replacement. Modern TVs and displays have HDMI ports, DisplayPorts and/ or USB-C ports. Laptops typically have HDMI, micro-HDMI, standard DisplayPort or Micro-DisplayPort or, increasingly, USB-C ports.
The easiest connections are between machines that use the same family of connectors. It doesn’t matter what type of DisplayPort plug one is using, as long as the connector ends match those on the devices. One should buy cables with the certification level of the most advanced device (source or display).
A similar situation occurs with HDMI. The ends have to match, and the cable quality should mirror that of the best machine.
If any source equipment features DisplayPort Dual-Mode (DP++) this is a standard which allows DisplayPort sources to use simple passive adapters to connect to HDMI or DVI displays. Again, it is a relatively simple solution.
If both source and display have HDMI ports, an HDMI cable can be used. If different technologies are used an adapter, such as HDMI-to-DisplayPort, or vice versa, can be used. Similarly, there are cables with VGA, DVI, Micro-DisplayPort, DisplayPort or USB-C connectors at one end and HDMI at the other end.
A USB-C to HDMI / USB-C / USB 3.0 multiport adapter is practical for computers using USB-C for AV output to displays, televisions and projectors, with video resolutions up to 4K (3840 × 2160p at 30 Hz). Source devices include MacBooks and Chromebooks, as well as several Windows and Linux devices.
Using this approach with laptops gives all of the disadvantages of a desktop without any of its advantages. The laptop may not have enough ports for everything. It can be time-consuming and frustrating unplugging and then reattaching peripherals after every move.
Previously, docking stations solved these problems. Peripherals, such as keyboard, mouse, display were plugged into the docking station, along with the laptop. Unfortunately, these were often brand and sometimes even model dependent. General-purpose docking towers are now available.
A Mobile High-Definition Link (MHL) adapter and a standard-HDMI to standard-HDMI cable is used to connect Android devices to HDMI displays. Here are the steps to follow.
Use a MHL (micro-USB to standard-HDMI) adapter.
Plug the male micro-USB (small) end of the adapter into the Android device.
Plug one end of a HDMI cable into the MHL adapter.
Plug the other end of a HDMI cable into a TV.
Turn on the TV.
Change your TV’s input to the relevant HDMI port.
Wait for the Android screen to display on the TV.
To work, the depicted adapter needs an external 5 V 1 A power supply to power the adapter and to charge the Android device. Plug the charger’s male micro-USB end into the female micro-USB port on the adapter.
Note: When using such an adapter for the first time, the phone must be rebooted after the adapter is connected to the phone, otherwise there will be no HDMI output.
Similar procedures need to be followed to connect an iPhone Lightning connector to a HDMI connector on a television, using an adapter. Once again, depending on the adapter, it may be necessary to have the charger plugged into the adapter for it to work. Other adapters only need to be plugged in if the iPhone needs to be charged.
USB4 & USB-C
As stated in the previous weblog post, the future of video connectivity is the USB4 protocol and the USB-C connector. While every USB-C port looks the same, they do not all provide the same functions. This is especially true for both power and video. USB-C’s main advantages include high data/ video throughput, and the ability to transfer electrical power. This means that USB-C monitors can get power as well as data from the source device they are connected to.
Why do I subject myself, and readers, to this massive outpouring of words, that a weblog represents? There are two reasons. The first is my addiction to writing, combined with an inability to keep a diary. When I try to write a diary, which used to happen about once a decade, it is more a reflection on the day’s events, than a chronicle of them. My determination as a diarist lasts about a week. There are a thousand excuses for any failure to write, not finding the diary being paramount.
The second reason involves brain stimulation. Novelty releases dopamine in the brain, which, in turn, stimulates the amygdala, the site of emotion, and creates a pleasurable feeling that is associated with the new activity. Fun encourages more fun, which is what writing a weblog post should be about.
A weblog suits my personality: my mind prefers to focus on topics, rather than on events, or on people; a keyboard is faster than a pen. Yet, despite its slowness, I still use pen and paper to write other texts, for handwriting is a skill that should be retained and sometimes promoted. My one vanity is a pride in my handwriting, which – when I take the time – is small and legible and produced using liquid blue ink. I have tried black, red and even green, but prefer blue.
Changes are easier to make with word processors than they are with pen on paper. While it may not be any easier to start writing a blog than to begin an entry in a diary, it is easier to continue again at a later date, and to make corrections. Innumerable drafts can be kept, that can be published in any order, or even discarded. Not even my hairdresser knows how many drafts I have stored away, awaiting further inspiration. As I write this sentence, there are 330 posts in total: 293 published, 6 scheduled, 31 in draft form, and one in the bin. That is the number on my private website, not on the Keywords website or inside my computers or on external drives.
As I write these lines my mind returns to the early 1980s, and to the technology available at the time. A computer terminal can show what was right, and what was wrong with an era that introduced the masses to computing.
The Digital Equipment Corporation (DEC) VT 100 terminal was my companion for three years. It was designed for computer programming, rather than word processing, with 80 columns and 24 lines of white text against a black background. The monitor/ screen was fixed in position, too low and unsuitably angled. The keyboard was far too high, and certainly not ergonomic. The user was expected to adapt to the machine.
The Norsk Data (ND) system, with its ND-Notis word processor was a vast improvement. The Tandberg TDV-2200 terminal was the most ergonomic terminal in the world at the time, its brown and beige monitor/ screen featured green text on a black background, and was fully adjustable in every direction. Its beige, brown and orange keyboard, was a delight to use. I used such a system, in two different environments, for four years. Even the colour scheme was explained in ergonomic terms, with claims that the colours chosen were more restive than any other colour combinations available. The machine was expected to adapt to the user.
It is easy for old men to focus on the past, but that is not what I want to do here. I want to encourage people of multiple ages and genders and ethnicities and interests to reflect on their lives, and on society, and then to express themselves. How will people in a future age know what it was like to live at this time, if you don’t tell them? How will people in distant lands, know what you experienced, are experiencing? Help them to understand, and you might also begin to understand yourself, and your place in the world.
A weblog does not have to be written words. It can be words or music as sound waves, or assorted movements as video. At the moment, it is more problematic to diffuse smells and tastes, but a time may come when this is possible. Poetry, drama, comedy, performance are all valid forms of expression. It could be images: photographs as works of art or mirrors of contemporary life; drawings, paintings, collage; sculpture that is knitted, cast in bronze or chiselled away from stone; culinary masterpieces. Craftsmanship is of particular importance to me, and I admire all forms of it: woodworking, metalworking, moulding in plastic, papier-mâché, sewing.
As is my habit, I encourage people to write weblogs, rather than to use social media. Restrict it to your best friends and closest family, and perhaps a couple of people you don’t know that well, to keep it honest. Write about the topics that interest you. Perhaps you know something about tractors, or dogs or playing congas.
With 7.7 billion people, the world is being consumed, so there is less nature for other species to inhabit, and ourselves to enjoy. We – in the west – will have to learn to consume less, so that others may fulfil their lives. Hopefully, this will also mean that we will work less, and spend our free time in pursuits that benefit others as well as ourselves. Even billionaires would benefit, if there was increased equality. Perhaps you could find, then document, ways of doing things better, that use less energy or water or materials. Perhaps you could invent, then document, new ways of having fun: A new game, dance, sport or toy.
This weblog post is published on the 42nd anniversary (2020-02-16) of the opening of the CBBS (1978-02-16), the world’s first bulletin board service. It also commemorates the life of Randy John Suess (1945-01-27 – 2019-12-10). Born in Skokie, Illinois, Suess served for two years in the U.S. Navy, before attending the University of Illinois at Chicago Circle. He worked for IBM and Zenith.
In the 1970s, Suess was an amateur radio operator, with call sign WB9GPM. He was an active member of the Chicago FM Club, where he helped with maintenance on their extensive radio repeater systems.
However, Suess is most famously remembered as the co-founder and hardware developer of the first bulletin board system (BBS), along with partner and software developer Ward Christensen (1945 – ). They met as members of the Chicago Area Computer Hobbyists’ Exchange, or CACHE about 1975. Development of the BBS started during the Great Blizzard in Chicago, and was officially online (an expression not used at the time) as CBBS = Computerized Bulletin Board System, on 1978-02-16.
The early development of this and other bulletin board systems is documented in a previous weblog post, and more extensively in BBS: The Documentary an 8-episode documentary series created by computer historian Jason Scott, made from 2001-07 to 2004-12.
The CBBS ran on an S-100 computer with an 8-bit, ca. 1 MHz processor, 64 kB RAM and two single-sided 8″ diskettes, each holding 173 kB, along with a Hayes MicroModem 100 running at 300 baud. The operating system was CP/M, but the BBS software was written in 8080 assembler, and was loaded automatically whenever someone dialed in at: 312-545-8086.
Attention to detail was important for the survival of the system. The floppy disk drive motors ran from mains electricity, and quickly burned out if left on. So the system was modified by Suess to turn itself on when the phone rang, and to keep going for a few seconds after the caller had finished to let the computer save its data, and then quietly go back to sleep. A unique feature of CBBS, was that if callers typed inappropriate words, these would be recognized and the system would log the caller out. Entering too many unproductive keystrokes would have the same effect.
Suess hosted CBBS, because his house in the Wrigleyville section of Chicago could be called without paying long-distance charges by anyone in Chicago. By the time the system was retired in the 1980s, its single phone line had logged over 11 000 unique users, and received more than 500 000 calls. A version of CBBS has run periodically, more than forty years after its inception, demonstrating the state of technology at the end of the 1970s.
Because of his interest in Unix systems, in 1982 he created the world’s first public-access Unix system, then called “wlcrjs”. In 1984 this became Chinet (Chicago Network), which connected to the internet through a satellite radio. It ran on one of the earliest Compaq portable machines, with a 4 MHz 8088 processor, 640kB of memory, and a 10 MB hard drive.
In the early days of Chinet, the Internet was still a research tool, unavailable to the general public. E-mail and newsgroup accounts were obtained from a university computer or a BBS-like system. There were no ISPs (Internet Service Providers). E-mail and newsgroup postings were relayed from one computer to the next using UUCP (Unix-to-Unix Copy), a suite of computer programs/ protocols allowing remote execution of commands and transfer of files, email and netnews between computers, mostly at night, mostly over regular telephone lines at 1200 or 2400 Baud. The entire content on the internet was so small that it could be downloaded in a single evening. This meant that Chicago area users could browse a global collection of data without paying long-distance telephone rates.
In the late 1980s, Chinet had 12 dialup ports for between 300 and 600 active users. Later these were replaced with 22 phone lines that connected to a bank of modems, operating at considerably higher speeds. Eventually, the UUCP model became obsolete, with more companies getting direct Internet access, and ISPs offering inexpensive access to consumers. Chinet’s dial-up port usage started to decline.
Chinet then started using PicoSpan as its BBS software. Eventually, yapp (Yet Another Picospan Program) replaced PicoSpan and remained in use until Chinet migrated from Unix shell-based access to web-based interfaces in the late 1990s.
Despite a fire in 1996-05, Chinet still exists today, entirely web-based, running Simple Machines Forum software on a Debian GNU/Linux system.
Suess was married twice, first to Agnes Kluck and then to Dawn Hendricks, both marriages ended in divorce. He had two daughters, Karrie and Christine and one son, Ryan.
Randy Suess died 2019-12-10 in Chicago, Illinois. Currently, CBBS is operative, with information available about Randy Suess and his death.
Full disclosure: I am a registered user of CBBS/ Chinet, Chicago.
Conclusion: If you are considering investing in new computing devices carefully consider ones equipped with USB-C ports. This is the technology of the present and future. USB-A, USB-B, Mini-B and Micro-B connectors are relics of the past. However, devices with them should not be discarded. Cables with two different USB types on each end enable interconnectivity.
The Universal Serial Bus (USB) standard simplifies and improves the interface between a host computer and peripheral devices, with minimal operator action. Its interface is self-configuring. Connectors are standardized, so any peripheral can use any available receptacle. Usually there are no user-adjustable interface settings. The interface is “hot pluggable”, so peripherals can be used without rebooting the host. Small devices can be powered directly from the interface. Protocols for recovery from errors are defined, increasing reliability.
The USB standard specifies cables, connectors and protocols for connection, communication and supplying power to and between computers (Host or A-side) and peripheral devices (B-side). Peripherals include external disk drives, keyboards, mice and printers. However, they are not the only connectors. Screens typically use HDMI and other video connectors, the subject of a later weblog post. Similarly, Ethernet cables are preferred for connecting desktop computers and other devices to computer networks.
Today, most new Android handheld devices aka smart phones use USB-C (type C) connectors for charging and data transfer. Older Android phones have a Micro-B port. Apple iPhones (since iPhone 5) and most iPads use a Lightning connector. While the USB-C and Lightning look very similar, they are not interchangeable.
Fourth-generation MacBook Pro laptops, first released in 2016-10, use USB-C for all data ports, and for charging. Windows laptops using USB-C ports for charging include some Acer Aspire, Asus Zenbook, Dell XPS, HP Spectre and Lenovo ThinkPad models. Many other laptop models use an assortment of chargers, usually incompatible with everything else.
While the European Union has relied on consensus to standardize handheld device connections, this has not worked. While most manufacturers use USB-C connectors, Apple uses a Lightning connector. Now, the EU has said it will legislate compliance that will force all providers of handheld devices, including Apple, to use USB-C connectors. When implemented, this will probably have implications for the entire world.
USB connectors are at the heart of legacy-free computers, a term devised by Microsoft to describe a PC without a lot of the equipment previously found on beige boxes, much of it large and slow. Most users appreciate this redesign, and especially the fact that a legacy-free PC must be able to boot (start up) from a USB device. The exception is gamers who, because of latency (time delay) issues, want to retain their PS/2 keyboard connectors.
Work on USB began in 1994, with a mandate from seven companies: Compaq, DEC, IBM, Intel, Microsoft, NEC, and Nortel. The goal was to replace a multitude of connectors with a single interface, and to increase data rates to and from external devices. A development team was established by Intel at Folsom, California led by Ajay Bhatt (1957 – ). The first integrated circuits supporting USB were produced by Intel in 1995.
The USB standard is currently maintained by the USB Implementers Forum (USB-IF), with four generations of specifications: USB 1.x (1996), USB 2.0 (2000), USB 3.x (2008, 2013, 2014, 2017) and USB4 (2019).
USB 1.0 ports are no longer relevant. However, later interfaces have been made backwards compatible. Thus, USB4 is backwards compatible with everything from USB 3.2 back to USB 2.0.
USB 2.0 (2000-04): 480 Mbit/s
USB 3.0 (2008-11): 5 Gbit/s
USB 3.1 (2013-07): 10 Gbit/s
USB 3.2 (2017-08): 20 Gbit/s
USB 1.0, from 1996-01, provided 12 Mbit/s of data transfer, with an unshielded cable version allowing the attachment of inexpensive peripherals at a data rate of 1.5 Mbit/s. USB 2.0 increased the data rate to 480 Mbit/s.
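The practical meaning of these data rates can be illustrated with a short Python sketch that estimates the best-case time to transfer a file at each generation's nominal rate. Real-world throughput is lower because of protocol overhead; the function and example are my own:

```python
# Nominal signalling rates in Mbit/s for each USB generation.
USB_RATES_MBIT = {
    "USB 1.0": 12,
    "USB 2.0": 480,
    "USB 3.0": 5_000,
    "USB 3.1": 10_000,
    "USB 3.2": 20_000,
    "USB4": 40_000,
}

def transfer_seconds(size_gb, generation):
    """Best-case seconds to move size_gb gigabytes at the nominal rate."""
    megabits = size_gb * 8_000  # 1 GB = 8 000 Mbit
    return megabits / USB_RATES_MBIT[generation]

# Moving a 4 GB file: about 66.7 s over USB 2.0, 6.4 s over USB 3.0.
for gen in ("USB 2.0", "USB 3.0"):
    print(gen, round(transfer_seconds(4, gen), 1), "s")
```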
There are two versions of USB connectors: one on a device, and the other, on a cable, that fits into it. The device or female connector is referred to as a receptacle or port. The cable or male connector is referred to as a plug. Originally, a USB connection was always between a host or hub at the A connector end, and a device or hub’s upstream port at the B connector end. This is why peripherals, such as printers, use Type-B connectors. With handheld devices, the charger is regarded as the A end, while the device is regarded as the B end. Things are no longer so simple.
Depending on the version, these connectors have from 4 to 9 pins, and are rated for a specified number of insertion/ removal cycles.
The Future: USB4 & USB-C
The future of USB connectivity is the USB4 protocol and the USB-C connector. Yes, both of these could be replaced, and probably will be sometime in the future, but both represent reality now. While every USB-C port looks the same, they do not all provide the same functions. The main functions are data, video and power transfer. It is not possible to tell by looking at the port what is incorporated. If there is no documentation stating otherwise, consumers have to assume that they are simply data ports.
The USB-C connector entered use with the USB 3.1 protocol, which allows data transfers at 10 Gbit/s, theoretically twice as fast as USB 3.0. The USB 3.1 protocol can also be applied to USB 3.1 Type-A ports. Note: USB-IF has created unnecessary name confusion, in that USB 3.0 connections are also known as USB 3.1 Gen 1 and USB 3.2 Gen 1x1, while USB 3.1 connections are also known as USB 3.1 Gen 2 and USB 3.2 Gen 2x1. The new USB 3.2 port is referred to as USB 3.2 Gen 2x2. The x1 or x2 refers to the number of lanes.
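The renaming scheme described above can be captured in a small lookup table. This Python sketch (the dictionary and function are my own, but the aliases follow the note above) maps any of the marketing names back to the original version:

```python
# USB-IF's renaming: each underlying connection speed has several names.
USB_ALIASES = {
    "USB 3.0": {"aliases": ["USB 3.1 Gen 1", "USB 3.2 Gen 1x1"], "gbps": 5},
    "USB 3.1": {"aliases": ["USB 3.1 Gen 2", "USB 3.2 Gen 2x1"], "gbps": 10},
    "USB 3.2": {"aliases": ["USB 3.2 Gen 2x2"], "gbps": 20},
}

def what_is(name):
    """Map any of the marketing names back to the original version and speed."""
    for original, info in USB_ALIASES.items():
        if name == original or name in info["aliases"]:
            return original, info["gbps"]
    raise KeyError(name)

print(what_is("USB 3.2 Gen 1x1"))  # ('USB 3.0', 5): plain old USB 3.0
```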
USB4 incorporates the Thunderbolt 3 protocol into the USB mainstream. The Thunderbolt interface was developed by Intel and Apple. It combines PCI Express and DisplayPort into two serial signals and provides DC power. Thunderbolt 1 and 2 use a Mini DisplayPort connector, whereas Thunderbolt 3 uses the USB-C connector.
Depending on phone, computer and vehicle configuration several different cable types may be required. In Scandinavia, Clas Ohlson stores offer a large selection, as do many specialist stores and online stores.
USB-Micro B to USB-A: Connect older Android phones to older chargers, older computers, and older automobile data ports.
USB-C to USB-A: Connect newer Android phones & newer iPhones to older chargers, older computers and older automobile data ports.
USB-C to USB-C: Connect newer Android phones to USB-PD (power delivery) chargers, USB PD compatible batteries, some computers (with USB-C ports), 12V car chargers, newer automobile data ports.
Lightning to USB Type-A: Connect most iOS devices to legacy automobile data ports for CarPlay.
Lightning to USB Type-C: Connect most iOS devices to current generation macOS devices using USB PD compatible batteries, wall chargers, and 12V car chargers.
USB Type-A to proprietary cable/magnetic connector charger for Apple Watch or Samsung Galaxy Gear/Android Wear or older Apple Mac equipment.
Additional information on USB-C ports will be presented in the two next weblog posts in this series: 7. Video connectors (2020-02-18) and 8. Power supply/ charging (2020-02-25).
An Aside: PS/2 keyboards
First, a PS2 keyboard is not a PS/2 keyboard. The former refers to Sony’s PlayStation 2, launched in 2000; the latter, to ports on IBM’s third-generation personal computer, the Personal System/2, launched in 1987. It is an ancient and outdated system. Yet, gamers often prefer to use PS/2 keyboards (and sometimes even mice) for several reasons. First, PS/2 is interrupt-driven: whenever a key is pressed, the keyboard signals the computer immediately. This contrasts with USB, where the computer polls USB ports, and through them attached devices, at a rate dependent on the frequency of the port. Previously, this was about 125 Hz, which could introduce a latency (delay) of up to about 8 ms. The polling frequency on modern computers is about 1000 Hz, which reduces this latency to a maximum of 1 ms.
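The latency figures above follow directly from the polling interval: in the worst case, a keystroke arrives just after a poll and waits one full interval before the host notices it. A minimal Python illustration:

```python
def worst_case_latency_ms(polling_hz):
    """Worst-case added input latency for a polled USB device.

    A keystroke can arrive just after a poll, so it waits one full
    polling interval before the host notices it.
    """
    return 1000.0 / polling_hz

print(worst_case_latency_ms(125))   # 8.0 ms on an older 125 Hz port
print(worst_case_latency_ms(1000))  # 1.0 ms on a modern 1000 Hz port
```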
PS/2 keyboards also support N-key rollover, which allows users to press as many keys as they want simultaneously. This was not possible with early USB keyboards, although many newer USB gaming keyboards now support it.
PS/2 peripherals work immediately without drivers. This is especially useful when diagnosing motherboard and related problems that USB devices cannot detect.
PS/2 devices are not hot-swappable. If a device is plugged into a PS/2 port when a computer is operating, the machine will have to be restarted for the device to function.
Unlike keyboards, USB mice have an adjustable polling rate, allowing them to have polling rates of up to 1000 Hz. Thus, they have had far fewer issues than USB keyboards.
PS/2 hardware is being phased out, and is unavailable on many modern gaming motherboards = a printed circuit board containing the main computer components along with various connectors. Unless that hardware is built into the motherboard, there is no point in using PS/2 equipment, and no point in using a USB adapter to correct any of USB’s deficiencies. At that point it is “game over” for PS/2, and the user might as well use USB peripherals.
Full disclosure. At one time, I had responsibility for laundry, washing the family clothes/ bed linen/ textiles more generally. Unfortunately, my supervisor was not totally satisfied with the results, and I was fired from that position. Since then, and a couple of washing machine purchases later, that same supervisor is still in charge of laundry. Thus, this weblog post is being written as a spectator to the washing process, and not as an active participant in it.
What does environmentally friendly clothes washing look like? At one time, say, until minutes before starting to write this post, I thought it was all about energy savings. Lower temperatures used less hot water. Now, I find that there are other considerations. A University of Leeds in-house article points to research on this topic.
Lucy Cotton, Adam S. Hayward, Neil J. Lant & Richard S. Blackburn have written an article, “Improved garment longevity and reduced microfibre release are important sustainability benefits of laundering in colder and quicker washing machine cycles”, that appeared 2020-01-14 in Dyes and Pigments. The article’s abstract can be found in the last paragraph of this weblog post.
Clothes and bed linen have to retain their integrity: neither falling apart, nor sending microfibres into waste water, nor sapping textiles of their colour, nor shrinking. For decades now the washing temperature chez McLellan has been set to 50C, unless there is some specific reason for decreasing/ increasing it. The most common reason for decreasing the temperature is that the laundry load consists of garments made from wool. These items are washed at 30C. We use Milo, a laundry liquid, for wool (and silk) textiles. We use Neutral Colour, a laundry powder, for everything else. One reason for increasing temperature could be an attempt at stain removal.
A Guardian article about the research explained that 25C was selected as the working temperature because that is a common inlet temperature for water in washing machines in England – the natural, unchilled and unheated temperature at which the water enters the drum. I am looking forward to gaining some empirical data on Norwegian inlet temperatures. Currently, my estimate is that they are closer to 10C. A post with updated information will be written after data has been collected and analysed.
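For when those Norwegian measurements exist, here is a minimal sketch of how a handful of inlet-temperature readings might be summarised. The numbers below are invented placeholders, not collected data:

```python
import statistics

# Hypothetical inlet-temperature readings in degrees C (illustrative
# only; the real Norwegian measurements have not been collected yet).
readings = [9.5, 10.2, 8.8, 11.0, 9.9]

print(statistics.mean(readings))    # average inlet temperature
print(statistics.median(readings))  # middle value, robust to outliers
```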
Cotton’s research did not test cleanliness at different temperatures. Instead, it examined the release of dye (desorption) and microfibres from a range of consumer clothing. So, the pesky question remains: Can cleanliness happen in a cold, quick wash?
Cleanliness has a social and cultural dimension beyond the requirements of hygiene for practical purposes. Numerous internet sources are quick to state that higher washing temperatures improve cleanliness, but there is seldom any empirical data associated with it. There seems to be a common expectation that cleaning will not only remove dirt, but also kill microbes. Some people may expect textiles generally, and clothes especially, to have a pleasant odour; others may want textiles to be odour neutral. Some may even expect a miracle: to have whites (or, more correctly, some version of off-white) whitened/ lightened/ bleached.
Yet, excessive cleanliness may not be a virtue. The hygiene hypothesis, developed by David P. Strachan holds that environmental microbes help develop the immune system; the fewer microbes people are exposed to in early childhood, the more likely they are to experience health problems in childhood and as adults. So, it might be a good idea to reduce washing temperature, to avoid killing so many microbes.
However, it may be suitable to wash clothes and bed linen at higher temperatures when people have been sick with contagious diseases, but to use lower temperatures more generally. The instructions on Neutral Colour specifically state that 60C should be used for washing bed linen, if the user is allergic to dust. The instructions on Milo state that both 40C and 30C can be used.
In an attempt to get empirical data on the relationship between cleaning effectiveness and washing temperature, I found instead an article explaining the futility of the effort: “No more washing: Nano-enhanced textiles clean themselves with light.” Researchers at the Royal Melbourne Institute of Technology (RMIT) University in Melbourne, Australia, have developed a cheap and efficient way to grow special nanostructures, which can degrade organic matter when exposed to light, and to incorporate them onto textiles. They are working at producing nano-material enhanced textiles that can spontaneously clean themselves of stains and other dirt when exposed to light, artificial (as in a lamp) or natural (as in the sun).
Then, Wikipedia writes: “Animal studies indicate that carbon nanotubes and carbon nanofibers can cause pulmonary effects including inflammation, granulomas, and pulmonary fibrosis, which were of similar or greater potency when compared with other known fibrogenic materials such as silica, asbestos, and ultrafine carbon black. Some studies in cells or animals have shown genotoxic or carcinogenic effects, or systemic cardiovascular effects from pulmonary exposure.” It is definitely a hint that relying on nanoparticles may not be the way to go.
Domotics, or household automation, is defined in an earlier weblog post. The main concern there was with room controllers. However, domotics intersects many different areas. Its early history began with labour-saving machines/ devices, frequently electrically powered. Thus, initial advances were dependent on electric power reaching the houses of consumers.
In this post, laundry represents a very specific yet important application of domotics. Most people have a concern about the cleanliness of their textiles, but there needs to be a middle ground. People can be too slovenly or too finicky. Neither extreme is particularly good.
It can be difficult to date the first washing machine. Progress and significant developments occurred in 1691/ 1767/ 1862/ 1904/ 1937, depending how one defines a washing machine, and one’s expectations from it. However, it is not until the late 1940s, that one finds machines vaguely similar to today’s.
In Europe, washing machines have to meet an assortment of standards, including 1. washing performance (A to G), 2. spin drying performance (A to G), 3. energy consumption (A+++ to G), and 4. waste water filtration. In addition to these, products must also be labeled to show capacity (in kilograms), water consumption (in litres), and noise while washing and while spinning (both in dB(A)).
Only washing machines with category A performance in terms of washing and spinning are allowed to be sold in Europe. This is because of an unexpected consequence of the earlier focus on energy consumption: reduced rinsing after washing saved water and energy, but left more detergent residue in clothes, which created health problems, especially for people with allergies or sensitivities.
The waste water filters ensure that hazardous chemicals are not improperly disposed of, and that feces and other waste do not enter the washing machine through back-flow in the waste water line.
Most washing machines in Europe are connected only to cold water. After filling, internal electrical heating elements heat the wash water, to 95C if desired. This may be useful in certain situations, either to kill pathogens, or to improve detergent cleaning action. These machines can use detergents formulated to release different chemical ingredients at different temperatures, allowing different types of stains and soils to be cleaned as the water is heated. High-temperature washing uses more energy, and may damage fabrics, especially elastics.
Temperatures above 40C inactivate biological detergent enzymes. Thus, if these are used, temperatures should be kept below 40C. Front-loaders used to need low-sudsing detergents, because the drum’s tumbling action caused over-sudsing and overflows. This is no longer an issue if one reduces the quantity of detergent, which does not lessen its cleaning action.
Some modern washing machines offer:
Delayed execution settings, with a timer to delay the start of the laundry cycle.
Time remaining indicator.
Predefined programs for different laundry types.
Rotation speed settings.
Temperature settings, including cold wash.
Child lock (some machines).
Microcontrollers have been used for timing processes since the 1990s. These are more reliable and cost-effective than electromechanical timers.
Touch panel displays have been in use since 2010.
LG Electronics invented a direct drive motor system in 1998, with the stator attached to the rear of the outer drum, with the rotor mounted on the shaft of the inner drum. It eliminates a pulley, belt and belt tensioner.
Electronically controlled motor speed.
Hydraulic suspension, with freely moving steel balls contained inside a ring mounted at the front and back of the drum, to balance the weight of the clothes.
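The delayed-start setting listed above can be illustrated with a toy calculation (my own sketch, not any manufacturer’s firmware): given the cycle length, work out how many hours to enter on the timer so the laundry finishes at a chosen time.

```python
from datetime import datetime, timedelta

def delay_hours(now: datetime, cycle_minutes: int, finish_by: datetime) -> float:
    """Hours to enter on a delayed-start timer so that a cycle of the
    given length finishes at (or just before) the desired time."""
    latest_start = finish_by - timedelta(minutes=cycle_minutes)
    delay = (latest_start - now).total_seconds() / 3600
    return max(delay, 0.0)  # cannot start in the past

# Example: it is 22:00, the cycle runs 85 minutes, laundry wanted by 07:00.
now = datetime(2020, 1, 14, 22, 0)
done = datetime(2020, 1, 15, 7, 0)
print(round(delay_hours(now, 85, done), 2))  # 7.58 hours of delay
```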
Other innovations include:
Water jets/ sprays/ showers and steam nozzles. These claim to reduce soil and reduce washing times. Water jets use recirculated water from the bottom of the drum.
Titanium/ ceramic heating elements claim to eliminate calcium build up and can heat water to 95C.
Automatic dispensers allow users to fill or replace tanks with detergent and softener, but transfer dosage control to the washing machine.
Detergent dilution is used to prevent detergent from damaging fabrics.
Pulsators are mounted on a plate on the bottom of the drum. When the plate spins, the pulsators generate waves that help shake soil out of the fabric.
Mechanisms can be installed to remove undissolved detergent residue on the detergent dispenser.
Smartphones can be used to troubleshoot problems. LG Electronics’ approach involves sending signals that can be received by a phone. These signals are interpreted by an app. Samsung Electronics’ approach involves the user photographing the washer’s time display with a phone. The problem and steps to resolve it are displayed on the phone itself.
Some washing machines are Near-Field Communication (NFC) enabled. These allow automatic and fast connections, wireless data transfer and identification. Its downsides include limited throughput, limited range and potential security vulnerabilities.
Comparison of domestic washing machines. Front-loaders are usually considered better than top-loaders:
Front-loaders usually have lower operating costs than top-loaders.
Front-loaders usually have a lower total cost of ownership compared to top-loaders.
Front-loaders tend to have more advanced features such as internal water heating, automatic dirt sensors and high-speed emptying compared to top-loaders.
Front-loaders usually use less energy than top-loaders.
Front-loaders usually use one-third to one-half the water used by top-loaders.
Front-loaders usually use less detergent than top-loaders.
Front-loaders can spin at 1 400 RPM, while top-loaders have a maximum speed of about 1 140 RPM. Spin speed and drum diameter determine the g-force working to remove residual water, allowing textiles to dry faster. This also reduces energy consumption if clothes are dried in a clothes dryer.
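The g-force claim can be checked with elementary physics: centripetal acceleration at the drum wall is the square of the angular velocity times the drum radius. A sketch, assuming an illustrative 0.50 m drum diameter (my assumption, not a figure from this post):

```python
import math

def spin_g_force(rpm: float, drum_diameter_m: float) -> float:
    """Centripetal acceleration at the drum wall, expressed in g."""
    omega = 2 * math.pi * rpm / 60   # angular velocity in rad/s
    radius = drum_diameter_m / 2
    return omega ** 2 * radius / 9.81

# Illustrative comparison at the spin speeds mentioned above.
print(round(spin_g_force(1400, 0.50)))  # front-loader: about 548 g
print(round(spin_g_force(1140, 0.50)))  # top-loader: about 363 g
```

The quadratic dependence on spin speed explains why a modest RPM advantage translates into substantially drier laundry.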
Front-loaders have longer cycle times than top-loaders, in part because top-loaders emphasize operational simplicity.
Front-loaders use paddles in the drum to repeatedly pick up and drop clothes into water for cleaning. Top-loaders use an agitator or impeller mechanism to force enough water through clothes to clean them effectively. Washing with a front-loader decreases fabric wear compared to a top-loader. One proxy for clothes wear is the amount of lint found on a clothes dryer lint filter. Lint is largely stray fibers detached from textiles during washing and drying.
Top-loaders have greater difficulty cleaning large items, because these items may float on top of the wash water rather than circulate within it, and the agitator may damage delicate fabrics.
Front-loaders are usually quieter than top-loaders, because the door seal insulates noise. They usually have fewer problems with load imbalance. Top-loader agitators usually require the use of a mechanical transmission, which generates more noise than front-loader drives.
Front-loaders can be installed underneath counters. They can also be stacked with a clothes dryer to reduce their area usage.
Front-loaders have simple motor drives in contrast to top-loaders that use mechanical gearboxes with increased wear and maintenance needs.
There are some areas where top-loaders are better than front-loaders.
Front-loaders usually have a higher initial cost than top-loaders.
Front-loaders are more prone to leakage. On top-loaders, gravity keeps water inside the drum, while front-loaders require a gasket/ seal on the front door. Previously, this door had to be locked while the machine was operating to prevent it from opening. Many modern front-loaders use so little water that the water level stays below the door, allowing them to be stopped mid-cycle for addition or removal of laundry.
Top-loading washers are more tolerant of maintenance neglect.
All washing machines should have access to a floor drain or an overflow catch tray with a drain connection.
Front-loaders are more convenient for short people and those with mobility issues. Controls are front-mounted and the horizontal drum eliminates the need for standing or climbing.
Risers, often with storage drawers underneath, can be used to raise a front-loader to the user’s preferred height.
Combined washer dryers provide washing cycles and drying cycles in the same drum.
The need to transfer wet clothes from a washer to a dryer is eliminated.
It encourages overnight cleaning, which may help optimize power usage during off-peak periods, but may increase the risk of fire.
Drying uses more energy than with two separate devices, because a combo washer dryer must not only dry the clothing, but also dry out the wash chamber itself.
A combo machine can be fitted into the same space as a single machine, allowing it to be used where the lack of space is an issue.
The washer may have a larger capacity than the dryer, decreasing the combination’s functionality.
A laundry centre is an appliance with two drum units. It is found in two varieties, one where the dryer and washer are beside each other, and the more common situation, where the dryer sits on top of the washer. Laundry centers have a single control panel for both units.
In the course of the coming years, I hope to look at many different household appliances from a domotics perspective. This includes revisiting washing machines, when I have learned more about how they are operated. My most obvious limitation is my status as spectator, rather than participant, in the laundry cleaning process. To remedy this, I am going to ask readers with insights into washing clothes to answer some open-ended, even vague, questions. These are preferred because they might actually lead to some insights.
Tell me something/ anything/ everything about washing textiles that you want me to be concerned about!
Tell me about yourself. What is your status? Participant/ Spectator/ Other (please specify). Are you female/ male/ non-binary/ a human bean? What is your approximate age? Have you ever had training to wash textiles? For example, did you learn anything at school about it?
How are your laundry facilities organized? A separate laundry room/ part of a bathroom/ part of another room/ other (please specify). What type of equipment do you use for washing? Comments. What type of equipment do you use for drying? Comments.
Do you use services outside of your residence to clean textiles? What services do you use? How often? Why? What are the challenges?
Do you use the services of non-family members, who come into your residence to clean textiles? What services do they provide? Why do you use them? What are the challenges of using them?
What are your textile washing objectives? Do any of them involve general dirt removal, stain removal, killing microbes, eliminating odours, adding odours, neutralizing odours, preventing shrinkage, preventing colour bleeding, increasing garment longevity, reducing microfibre release in wastewater, reducing operating costs, reducing energy consumption?
What are your textile washing behaviours? For example: What temperature settings do you use? What cleaning chemicals do you use? What additional chemicals do you use? Do you have any special procedures used in particular situations?
What would you like to automate in terms of laundry processes? Some choices include: start time, finish time, determining washing cycle, determining detergent quantities, automatic dosing of chemicals, other (please specify).
Washing machine characteristics. Do you use a front-loader or top-loader, please specify brand, model and approximate age (I will try to find out other details from that). What do you like about this machine? What do you dislike about it? When do you expect to replace this machine? What do you think will be the reason for its replacement?
Future washing machine characteristics. What characteristics missing from your current machine would you prioritize in your next machine? What characteristics present in your current machine would you avoid in your next machine?
Did you want to tell me something else?
Abstract from Cotton et al.
The global impact of laundering clothing is significant, with high levels of water, energy use, and pollution associated with this consumer care process. In this research, the impacts of washing temperature and washing time on garment colour loss (dye fading), colour transfer (dye staining), and microfibre release were evaluated using retail consumer clothing. Significantly greater colour loss and greater colour transfer were observed for a 40 °C, 85 min wash cycle compared to cold-quick (25 °C; 30 min) cycle. Desorbing dyes were found to mainly be reactive dyes. From fundamental kinetic studies, it was observed that significant increases in both rate of dye desorption and total dye desorption occurred when increasing from 20 °C to 40 °C, but the difference in dye release between 40 °C and 60 °C was not as significant; the same kinetic trends were observed for dye transfer. Microfibre release was significantly greater for the 40 °C, 85 min cycle in comparison with the cold-quick cycle, and this effect continued with further washes. These results mean that reducing time and temperature in laundry could have a significant impact in terms of extended garment longevity and reduced dye and microfibre liberation into the environment, in addition to energy savings.