Antipattern
Antipatterns are common practices that initially appear to be beneficial but ultimately produce bad consequences that outweigh the hoped-for advantages. The term, coined in 1995 by programmer Andrew Koenig, was inspired by the book ‘Design Patterns,’ in which the authors highlighted a number of software development practices they considered highly reliable and effective.
The term was popularized three years later by the book ‘AntiPatterns,’ which extended its use beyond the field of software design to general social interaction; it is now used informally for any commonly reinvented but bad solution to a problem. Examples include analysis paralysis (over-analyzing a situation while indefinitely delaying a decision), cargo cult programming (the ritual inclusion of code that serves no real purpose), death march (pressing ahead on a project its members believe is destined to fail), groupthink (a desire for group harmony producing irrational decisions), and vendor lock-in (making customers dependent on a vendor and unable to seek alternatives).
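To make one of those examples concrete, here is a small invented sketch of cargo cult programming; the function and its ‘safety’ rituals are hypothetical, the point being that the copied incantations accomplish nothing:

```python
# Invented example of cargo cult programming: rituals copied from
# other codebases that serve no purpose in this one.
import gc

def load_config(path):
    with open(path) as f:
        data = f.read()
    gc.collect()           # ritual: manual garbage collection 'for safety';
                           # CPython has already reclaimed the file handle
    try:
        return data
    except Exception:      # ritual: can never trigger, since a bare
        return None        # return statement raises nothing to catch
```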
Lindy Effect
The Lindy Effect is a theory of the permanence of non-perishable things: unlike that of a biological organism, the life expectancy of an idea or technology increases as it ages. The origin of the concept can be traced to biographer Albert Goldman and a 1964 article he wrote for ‘The New Republic’ titled ‘Lindy’s Law.’ In it he stated that ‘the future career expectations of a television comedian is proportional to the total amount of his past exposure on the medium.’ The name refers to Lindy’s, a New York deli known as a hangout for comedians; they would ‘foregather every night at Lindy’s, where… they conduct post-mortems on recent show biz “action.”’
Mathematician Benoit Mandelbrot formally coined the term ‘Lindy Effect’ in his 1982 book ‘The Fractal Geometry of Nature.’ Mandelbrot expressed mathematically that for certain things bounded by the life of the producer, like human promise, future life expectancy is proportional to the past: ‘However long a person’s past collected works, it will on the average continue for an equal additional amount. When it eventually stops, it breaks off at precisely half of its promise.’
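One common way to formalize the effect (a modeling assumption, not Mandelbrot’s own derivation) is a heavy-tailed Pareto distribution of lifespans, under which expected remaining life grows in proportion to the age already survived. A minimal numerical sketch:

```python
# Minimal sketch: under a Pareto (power-law) lifetime model, one common
# formalization of the Lindy Effect, expected remaining life grows
# linearly with age already survived. Parameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
alpha = 2.0                                        # tail index (alpha > 1)
lifetimes = rng.pareto(alpha, 1_000_000) + 1.0     # classical Pareto, minimum 1

for age in (1, 2, 4, 8, 16):
    survivors = lifetimes[lifetimes > age]
    remaining = (survivors - age).mean()
    print(f"age {age:2d}: mean remaining life ~ {remaining:.1f}")
# For alpha = 2 the theory gives E[remaining | age t] = t, so the
# printed values should roughly double down the list.
```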
Sentience Quotient
The sentience [sen-shuhns] quotient [kwoh-shuhnt] (SQ) was introduced by nanotechnology researcher Robert A. Freitas Jr. in the late 1970s. It defines sentience as the relationship between the information processing rate (in bits per second) of each individual processing unit (neuron), the weight/size of a single unit, and the total number of processing units (expressed as mass). This is a non-standard usage of the word ‘sentience,’ which normally relates to an organism’s capacity to perceive the world subjectively (it is derived from the Latin word ‘sentire’ meaning ‘to feel’ and is closely related to the word ‘sentiment’; intelligence or cognitive capacity is better denoted by ‘sapience’).
The potential and total processing capacity of a brain, based on the number of neurons, the processing rate and mass of a single neuron, its design (e.g. myelin coating, specialized areas), and its programming, lays the foundation of an individual’s brain level. This applies not just to humans but to all organisms, even artificial ones such as computers (although their ‘brains’ are not based on neurons). The SQ of an individual is therefore a measure of the efficiency of a brain, not of its relative intelligence.
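Freitas’s quotient is commonly stated as a simple logarithmic ratio; the sketch below uses rough, illustrative order-of-magnitude guesses for the inputs, not measured values:

```python
# Sketch of the sentience quotient as commonly stated in Freitas's work:
# SQ = log10(I / M), where I is the information processing rate in bits
# per second and M the brain mass in kilograms. Inputs are rough,
# illustrative order-of-magnitude guesses.
import math

def sentience_quotient(bits_per_second: float, mass_kg: float) -> float:
    """SQ: log10 of processing rate per unit of brain mass."""
    return math.log10(bits_per_second / mass_kg)

# ~10^11 neurons at ~10^2 bits/s each, in a ~1.4 kg brain
print(f"human brain: SQ = {sentience_quotient(1e13, 1.4):+.0f}")      # ~ +13
# a room-sized 1970s computer, with purely illustrative figures
print(f"early computer: SQ = {sentience_quotient(1e9, 1e4):+.0f}")    # ~ +5
```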
Ecophagy
Ecophagy [ih-koh-fuh-jee] is a term coined by molecular engineering scientist Robert Freitas that means the literal consumption of an ecosystem. He wrote: ‘Perhaps the earliest-recognized and best-known danger of molecular nanotechnology is the risk that self-replicating nanorobots capable of functioning autonomously in the natural environment could quickly convert that natural environment (e.g., “biomass”) into replicas of themselves (e.g., “nanomass”) on a global basis, a scenario usually referred to as the “grey goo problem” but perhaps more properly termed “global ecophagy.”’
The term has since been used to describe several other world-destroying events, including nuclear war, catastrophic monoculture (lack of biodiversity in farming), and mass extinction due to climate change. Scholars suggest that these events might result in ecocide, in that they would undermine the capacity of the Earth’s biological population to repair itself. Others suggest that more mundane and less spectacular events—the unrelenting growth of the human population, the steady transformation of the natural world by human beings—will eventually result in a planet that is considerably less vibrant, and one that is, apart from humans, essentially lifeless.
The Machine Stops
‘The Machine Stops’ is a science fiction short story written in 1909 by E. M. Forster, who is known for his ironic and well-plotted novels examining class difference and hypocrisy in early 20th-century British society. After initial publication in ‘The Oxford and Cambridge Review,’ the story was republished in Forster’s ‘The Eternal Moment and Other Stories’ in 1928. It is particularly notable for predicting new technologies such as instant messaging and the Internet.
Forster describes a world in which most of the human population has lost the ability to live on the surface of the Earth. Individuals live in isolation below ground in a standard ‘cell,’ with all bodily and spiritual needs met by the omnipotent, global ‘Machine.’ Travel is permitted but unpopular and rarely necessary. Communication is conducted via a kind of instant messaging/video conferencing machine called the speaking apparatus, with which people conduct their only activity: the sharing of ideas and what passes for knowledge.
Fordite
Fordite, also known as ‘Detroit agate,’ is old automobile paint which has hardened sufficiently to be cut and polished.
It formed from the build-up of layers of enamel paint slag on the tracks and skids on which cars were hand spray-painted (a process that is now automated), layers that were baked numerous times. In recent times the material has been recycled as eco-friendly jewelry.
Circular Reporting
In source criticism, circular reporting or false confirmation is a situation in which a piece of information appears to come from multiple independent sources but in fact comes from only one. In most cases the problem arises accidentally through sloppy intelligence-gathering practices, though occasionally the situation is engineered deliberately by the original source. It occurs in a variety of fields, including intelligence gathering, journalism, and scholarly research. It is of particular concern in military intelligence because the original source is more likely to want to pass on misinformation, and because the chain of reporting is more easily obscured.
Wikipedia is sometimes criticized as a vehicle for circular reporting, and it advises researchers and journalists to be wary of using it as a direct source, focusing instead on the verifiable information in an article’s cited references. In 2008 an American student edited Wikipedia in jest, writing that the coati (a small mammal in the raccoon family) was ‘also known as… the Brazilian aardvark’; many publications, including ‘The Independent,’ subsequently cited the unsubstantiated nickname as though it reflected general usage.
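The structure of the problem can be sketched in code: if each report’s citations are modeled as edges in a graph, tracing every chain back to reports that cite no one shows how many truly independent sources actually exist. The report names and citation graph below are invented for illustration:

```python
# Illustrative sketch (invented data): model citations as a directed graph
# and trace each report back to its root sources. Circular reporting shows
# up as many apparent sources collapsing into a single root.
cites = {                     # report -> reports it cites
    "article_A": ["wire_story"],
    "article_B": ["wire_story"],
    "wire_story": ["blog_post"],
    "article_C": ["article_A", "article_B"],
    "blog_post": [],          # cites no one: an original source
}

def roots(report, graph):
    """Return the set of original (uncited) sources behind a report."""
    if not graph[report]:
        return {report}
    found = set()
    for src in graph[report]:
        found |= roots(src, graph)
    return found

# article_C looks doubly confirmed, but both chains end at one source.
print(roots("article_C", cites))   # {'blog_post'}
```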
Cyanometer
A cyanometer [sahy-uh-nom-i-ter] (from cyan and -meter) is an instrument for measuring ‘blueness,’ specifically the color intensity of blue sky. It is attributed to Swiss aristocrat, physicist, and mountaineer Horace-Bénédict de Saussure. It consists of squares of paper dyed in graduated shades of blue and arranged in a color circle or square that can be held up and compared to the color of the sky. The blueness of the atmosphere indicates transparency and the amount of water vapor.
De Saussure is credited with inventing a cyanometer in 1789 with 53 sections, ranging from white to varying shades of blue (dyed with Prussian blue) and then to black, arranged in a circle; he used the device to measure the color of the sky at Geneva, Chamonix and Mont Blanc. He concluded, correctly, that the color of the sky was dependent on the amount of suspended particles in the atmosphere.
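The instrument’s logic translates readily into a small sketch in code. The linear interpolation and the RGB value used for Prussian blue are assumptions for illustration, not de Saussure’s actual calibration, which was dyed and judged by eye:

```python
# Sketch of a digital cyanometer (illustrative, not de Saussure's
# calibration): 53 swatches running white -> Prussian blue -> black,
# with nearest-swatch matching for a sampled sky color.
PRUSSIAN_BLUE = (0, 49, 83)     # assumed RGB approximation

def lerp(a, b, t):
    """Linearly interpolate between two RGB tuples."""
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

def swatches(n=53):
    """Swatch 0 is white, the middle is Prussian blue, swatch n-1 is black."""
    half = n // 2
    out = [lerp((255, 255, 255), PRUSSIAN_BLUE, i / half) for i in range(half)]
    out += [lerp(PRUSSIAN_BLUE, (0, 0, 0), i / (n - 1 - half))
            for i in range(n - half)]
    return out

def measure(sky_rgb, scale=None):
    """Return the index of the swatch closest to the sampled sky color."""
    scale = scale or swatches()
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(c, sky_rgb))
    return min(range(len(scale)), key=lambda i: dist(scale[i]))

print(measure((64, 105, 140)))   # -> 19, a mid-scale blue for a hazy sky
```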
Roller Skates
Roller skates are devices worn on the feet to enable the wearer to roll along on wheels. The first roller skates were converted ice skates, with two inline wheels instead of a blade. Later, the ‘quad’ style of roller skate became more popular, consisting of four wheels arranged in the same configuration as those of a typical car.
The first recorded roller skate was introduced in 1760 by Belgian inventor John Joseph Merlin. His inline two-wheelers were hard to steer and, lacking brakes, hard to stop, and as such were never very popular. In 1863, James Plimpton from Massachusetts invented the ‘rocking’ skate, using a four-wheel configuration for stability and independent axles that turned by pressing to one side when the skater wanted to create an edge. It was a vast improvement on the Merlin design and much easier to use, driving the huge popularity of roller skating that lasted through the 1930s. The Plimpton skate is still used today.
Common Carrier
Common carrier is a legal term for a company that transports goods or people and is responsible for any loss in transit. Such services are offered to the general public under license or authority provided by a regulatory body, which may create, interpret, and enforce its regulations upon the common carrier (subject to judicial review) with independence and finality, as long as it acts within the bounds of the enabling legislation.
A common carrier is distinguished from a contract carrier, which transports goods for only a certain number of clients and can refuse to transport goods for anyone else, and from a private carrier, a company that transports only its own goods. A common carrier holds itself out to provide service to the general public without discrimination (meeting the regulator’s quasi-judicial duty of impartiality toward the public interest) for the ‘public convenience and necessity.’
Indoor Positioning System
An indoor positioning system (IPS, or micromapping) is a network of devices that wirelessly locate objects or people inside a building. Instead of using satellites like GPS, it relies on nearby anchors (nodes with a known position), which either actively locate tags or provide ambient location or environmental context for devices. Systems use optical, radio, or even acoustic technologies. Integration of data from several navigation systems with different physical principles can increase the accuracy and robustness of the overall solution.
Wireless transmission indoors faces several obstacles, including signal attenuation caused by construction materials, multiple reflections at surfaces causing transmission errors, and interference from devices that emit or receive electromagnetic waves (e.g. microwave ovens, cellular phones). Error-correction systems that don’t rely on wireless signals are used to compensate for these shortcomings, such as Inertial Measurement Units (which report velocity, orientation, and gravitational forces using accelerometers and gyroscopes) and Simultaneous Localization and Mapping (SLAM, a technique used by robots and autonomous vehicles to build a map of an unknown environment).
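The core positioning step can be sketched compactly. Assuming three anchors at known coordinates and distances to a tag already estimated (from signal timing or strength, say), a 2-D position follows from linearizing the circle equations; all coordinates and ranges below are made up:

```python
# Minimal 2-D trilateration sketch: given three anchors at known positions
# and estimated distances to a tag, solve for the tag position.
import numpy as np

def trilaterate(anchors, dists):
    """Solve for (x, y) from (x_i, y_i) anchors and ranges d_i.

    Subtracting the first circle equation (x-x_i)^2 + (y-y_i)^2 = d_i^2
    from the others yields a linear system A @ p = b, solved here by
    least squares.
    """
    (x0, y0), d0 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]   # known node positions (m)
dists = [5.0, 8.06, 6.71]                          # measured ranges (m)
print(trilaterate(anchors, dists))                 # ~ [3.0, 4.0]
```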
Wireless Mesh
A wireless mesh network (WMN) is a communications network made up of radio nodes organized in a mesh topology; each node is connected to one or more other nodes, and information is passed from node to node until it reaches its destination. Since in most cases there is more than one path between any two nodes, such networks are very reliable: when a node fails, the data simply takes another route. This type of infrastructure can be decentralized (with no central server) or centrally managed (with a central server).
The coverage area of the radio nodes working as a single network is sometimes called a mesh cloud. Mesh architecture sustains signal strength by breaking long distances into a series of shorter hops. Intermediate nodes not only boost the signal, but cooperatively make forwarding decisions based on their knowledge of the network, i.e. they perform routing. With careful design, such an architecture can provide high bandwidth, spectral efficiency, and economic advantage over the coverage area.
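A toy sketch of that forwarding idea (the five-node topology is invented): compute the fewest-hop path through the mesh, then recompute around a failed node:

```python
# Toy sketch of mesh routing: find the fewest-hop path between nodes,
# then show rerouting when an intermediate node fails.
from collections import deque

def shortest_path(graph, src, dst, failed=frozenset()):
    """Breadth-first search for the fewest-hop path, skipping failed nodes."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None   # destination unreachable

mesh = {   # node -> radio neighbors
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
    "D": ["B", "C", "E"], "E": ["D"],
}
print(shortest_path(mesh, "A", "E"))                 # ['A', 'B', 'D', 'E']
print(shortest_path(mesh, "A", "E", failed={"B"}))   # ['A', 'C', 'D', 'E']
```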