"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » English Books » "Counting'' by Benjamin Wardhaugh

Add to favorite "Counting'' by Benjamin Wardhaugh

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

Hollerith’s machine was a genuine mechanical counter – not a machine for doing arithmetic, but one whose sole purpose and capability was to answer the question: how many? No human being had to pay individual attention to the items counted; they could simply be fed into the machine and a count accessed some time later, in the form of a number symbol indicated by the machine. The heart of the counting process – repeated attention plus keeping track – was carried out by the machine itself, raising an interesting question about how or whether a machine can ‘pay attention’ to anything. For the purpose of the census, the items counted were people as represented by punched pieces of manila card, but it was clear that almost anything could be counted by similar means. This was a genuine novelty in the history of counting, and one that, naturally, proved capable of enormous development over the subsequent decades. It is, perhaps, no great surprise that counting machines went on to change the world.

Machines of substantially the original design continued in use for processing the next US census, but from 1910 the Census Bureau ran its own machine shop to manufacture similar devices rather than rent from Hollerith at $1,000 per year for each machine. Hollerith’s designs, meanwhile, were taken up for census work and other statistical tasks in a succession of countries including Austria, Canada, Norway, Russia, France and Italy. By 1912, an admirer could speak, admittedly with some exaggeration, of their adoption by ‘nearly every civilized country’.

Later improvements included a system to record decimal numbers on the cards – using a choice of ten holes for each digit – and machinery to add up numbers so stored. Subtraction, multiplication and the means to represent negative numbers followed in due course, as did a device to feed cards into the machine automatically, doing away with much of the need for dextrous human operators. Special control cards could tell the machines to stop, and printing devices could automatically print the totals so that humans did not need to transcribe them.
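The storage scheme is easy to picture in modern terms. Here is a minimal sketch in Python – a simplification for illustration, not Hollerith's actual card layout – in which each decimal digit occupies one column of ten possible hole positions, and numbers so stored can be read back and added:

```python
# Simplified model of decimal storage on a punched card: each digit
# gets one column with ten possible hole positions (0-9), and the
# position actually punched encodes the digit's value.
def punch(number: int, columns: int) -> list[int]:
    """Return the hole position punched in each column,
    most significant digit first."""
    return [int(d) for d in str(number).zfill(columns)]

def read(card: list[int]) -> int:
    """Recover the stored number from the hole positions."""
    return int("".join(str(p) for p in card))

card = punch(1890, 4)
print(card)                             # [1, 8, 9, 0]
print(read(card))                       # 1890
print(read(card) + read(punch(24, 4)))  # 1914: adding numbers so stored
```

The tabulators with adding machinery did essentially this electromechanically, sensing which of the ten positions in each column had been punched and accumulating the result on a counter.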

Hollerith sold his limited company in 1911, staying on as a consultant; it became part of the Computing-Tabulating-Recording Company, which in the following decade changed its name to the International Business Machines Corporation. For much of the twentieth century, IBM would be one of the world’s largest data-processing companies, supplying equipment to governments and businesses around the world.

There is a disastrous epilogue to the 1890 census itself. In a departure from the practice of the US census up to 1870, no copies were taken of the census returns before they were sent to Washington. One reason was to save labour, and to reduce the burden on local administrations of producing and storing literally tons of paper forms. Another was the belief that the punched cards themselves constituted an adequate copy of the data: one official wrote optimistically that the original schedules ‘might every one of them be burned up, and the Eleventh Census could be taken over again from beginning to end, by means of the little slips of manilla’: that is, the punched cards. In fact, that was not the case, since the names of individuals were not recorded on the cards, only reference numbers denoting the original schedules where they could be found: data was irreversibly lost during the transcription onto punched cards. A third reason was that some felt the original data itself no longer had value once it had been counted, reduced to aggregate statistics and those statistics published: an ill-omened side effect of too-efficient counting practices, perhaps.

By March 1896, an accidental fire had already destroyed portions of the original 1890 census schedules. The bulky remainder were still in the portfolios in which they had been submitted, tied up with twine and piled flat in racks. Limited funds were available for their preservation, and together with 180 tons of punched cards, they were stored in poor conditions in a basement in Washington. On 10 January 1921, a five-alarm fire tore through the building. There were no human casualties, but four hours of fire and the jets from twenty hoses damaged perhaps two-thirds of the 1890 records beyond repair.

The remainder were disposed of as useless in the 1930s, missing out very narrowly on the advent of new attitudes to records management and the preservation of historical documents. The cornerstone of the American National Archives building was laid just a day before the disposal was authorised, and a programme to photostat and bind old census records was under way within the year. It was too late for the 1890 schedules and cards, and with their destruction a permanent, much-lamented gap was created in the records of American genealogy and social history.

What of the Japanese census machine, based on or inspired by Hollerith’s long-lived invention? By February 1904, Japan was at war with Russia, in a conflict that continued for a year and a half, with huge consequences for the government’s priorities and budgetary allocations. The census was postponed indefinitely, and a national population survey of Japan was not taken again until 1920, by which time Ichitaro’s prototype machine was obsolete, perhaps forgotten.

Today it rests in Tokyo, in the Statistical Museum of the Statistics Bureau of Japan, alongside other early computers and tabulators. Who were the people this machine counted, during its brief period of use as a prototype? It seems impossible to say.

Sia Yoon: Counting likes

South Korea; spring 2022. Sia Yoon (five feet tall, blood type A, born on 15 March and now seventeen years old) is woken by her digital alarm clock, sits up in bed, stretches. Turns on music (Schubert), showers, dresses, leaves her tiny apartment for Dara High School.

Half a million people follow Sia’s morning routine, because she is a successful vlogger. Aged seven, she was on TV as a martial arts prodigy, and she has never looked back. She calls herself a ‘natural-born cutie’ (‘Yes, yes! Be jealous of me!’), and the further reaches of stardom now beckon. A million followers is a goal, and she is starting to produce commercially sponsored posts: for lip tint, for tote bags (the tote bag post got 24,318 likes).

Sia’s day continues with classes, lunch, meetings with friends. She posts photos, checks the responses: how many views, how many likes (or dislikes). Checks her ranking (‘Of all Dara High School students, I rank in the top 1% for the number of subscribers!’).

It’s not all good news. Sia recently broke up with her boyfriend and vlogging partner of two years. There was naturally an element of personal despair about her response, but she’s also deeply worried about the effect on her online presence. Her photo post announcing the breakup got 155,205 likes in a few hours, but her subscribers fell away in their tens of thousands over the subsequent days. Her life at the moment is dominated by the search for ways to remedy the situation.

Sia Yoon, star of NewTube (webtoon.com).

In the decades either side of 2000, culture was transformed in many countries by a set of processes loosely grouped as ‘digital’. The hardware involved descended from three sources.

First, there were mechanical calculating machines, whose history goes back to the seventeenth century if not before. By the close of the nineteenth century, desk calculators that could perform the four arithmetical operations were commonplace, and they remained so until the middle years of the twentieth. Second, there were mechanical counting and calculating devices like Hollerith’s, using punched cards to store data, and manipulating the cards to perform various operations upon that data. Third, there were the typewriter and – even more common at one time – the cash register: input devices more user-friendly than laboriously punching holes in cards.

In the 1930s, it was already becoming feasible to assemble multiple punched-card machines to perform and control complex sequences of operations automatically, for applications in fields such as astronomy and engineering. IBM marketed a ‘card programmed calculator’ in the 1940s and 1950s that was already effectively a computer in the sense of being programmable and able to store, retrieve and process data. Innovations in the basic hardware followed – vacuum tubes, transistors, integrated circuits – leading over a few decades to smaller, faster and cheaper machines. Punched cards, as a means to store data and control machines, were similarly replaced by magnetic tapes and floppy discs. But the basic concepts of data storage, data manipulation and program remained central to what a computer was.

Vacuum-tube computers of the 1940s and 1950s each filled a room or even a building. Their names – they were rare enough to have individual names – now sound like quaint jokes: ‘Colossus’, ‘Whirlwind’. Yet, by the mid-1950s, one of them – the ENIAC at the University of Pennsylvania – could plausibly claim to have done more arithmetic in its eleven-year life than humanity had done in the whole of its previous history.

The uses of computers at first remained limited to industrial, military and scientific calculation, and to tasks of the bookkeeping type: airline reservation systems, human-resources administration and stock control. In the 1960s, there were perhaps 10,000 computers in the world. The following two decades saw the rise of affordable computers for individuals, and a software market sprang up, focused in particular on word processing, spreadsheets and databases. User interfaces became graphical rather than text-based in the late 1980s. Meanwhile, microchips found an increasing range of uses outside computers: in cash machines, for instance, in barcode scanners, in digital watches, in arcade games.

Initially, computers were isolated devices, but that, too, gradually changed through the 1960s and 1970s as new capabilities were devised. First, to enable remote working from multiple terminals within a single university campus, all linked to one central computer. Next, to network together multiple computers, whether in local networks with dedicated wiring or – using existing telephone lines – across continents and beyond. Once protocols had been standardised for securing a network and exchanging data over it, commercial online services took off rapidly from the mid-1980s. In 1990, there were just over 300,000 computers on the internet; a decade later there were 100 million.

Email was the crucial application in the initial stages of the internet’s popularisation, closely followed by the World Wide Web and by social media platforms which hosted, aggregated, sorted and sifted ever-increasing oceans of user-generated content. By the 2020s it was becoming difficult to think of a social or economic function that could not be done online: shopping, meeting, gaming, banking … The growth of the internet coincided with the development of more and more portable devices – laptops, smartphones, tablets and smart watches – and drove the spiralling demand for them.

Since the 1930s, science-fiction authors had speculated and fantasised about the global sharing of information, under evocative titles such as the ‘world brain’, the ‘memex’ and the ‘noosphere’. In the first decade of the twenty-first century, it seemed rather as though those dreams had come true, with both their positive and their negative consequences. For some, the internet created a radically new form of culture, a ‘consensual hallucination’, a digital playground in which individual perception, participation and bricolage would replace the old modes of cultural production. An ever-wider range of cultural artefacts were now either being ‘born digital’ or were rapidly being converted into digital surrogates for online distribution. Online identities were freely malleable, online freedoms unconstrained.

At the same time, it was conspicuous that some parts of human experience were being left behind because they resisted being transformed into data, and that some people – indeed, most people – were being left out of the digital revolution because of the accidents of wealth and access to technology. The new, participatory online culture was seldom – if ever – as open, as unconstrained or as democratic as it at first seemed, and it facilitated negative behaviours – theft, lying and bullying – just as often as positive ones. It also facilitated spying, coercion and control, sometimes in new and alarmingly powerful forms. In the new realm of online data, it often felt as though the tasks of navigation and quality control had been outsourced to end users ill-equipped to perform them: that what had been created was an unstable, unreliable post-truth society in which what mattered was how many followers you could get to believe you, not whether what you said bore any relationship to an offline reality.

All of this has everything and nothing to do with the history of counting. What is digitisation if it is not, in a sense, ‘counting’? (The use of ‘digit’ – literally a finger – to mean a number or a number symbol seems to derive from Roman and medieval finger-counting.) What is online culture if it is not the sheer profusion of modern ways of counting, the participatory spectacle of a world turned into pure numbers? Computers were inherently digital from the days of vacuum tubes, consisting of electrical components whose state was either on or off, conventionally interpreted as 1 and 0 in a binary representation of numbers.
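To make that last point concrete, here is a minimal sketch of the binary correspondence – illustrative only, tied to no particular machine – showing how a count is rewritten using just those two states:

```python
# A count can be written with only the symbols 0 and 1 by repeatedly
# dividing by two and keeping the remainders: each remainder records
# whether a given power of two is present in the number.
def to_binary(n: int) -> str:
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits
        n //= 2
    return digits or "0"

print(to_binary(13))   # '1101', i.e. 8 + 4 + 0 + 1
print(bin(13))         # '0b1101', Python's built-in equivalent
```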

Yet even by the early years of the twentieth century, punched-card machines were performing tasks very much more complicated than just counting. It seems forced to describe, say, the mid-century code-breaking devices as mere ‘counting machines’, and frankly bizarre to insist that that is what a smartphone really is. Indeed, once the function of counting has become detached from having a human being pay sequential attention to things or events and keep track while doing so, machine counting quickly begins to feel very unlike other kinds of counting. The interesting questions shift to its capability for transforming people into punched cards, paintings into on-screen images, videos into advertising revenue: and away from the fact that those transformations are mediated by binary numbers.

That said, microchip devices are also used for functions much more like old-fashioned counting: as a substitute, when asking how many, for counting in words, on your fingers, or using some other device. One of the most conspicuous is the counting of friends, likes, followers and views in social media contexts. Here, the digital world has provided not only a new technology with which to count, but new things to count.

All of these issues play out in microcosm in the life of Sia Yoon. She lives in South Korea, a country whose industrialisation and economic growth have involved conspicuous success in the electronics industry, from semiconductor components to phones and watches, and also successive phases of cultural export – the ‘Korean Wave’ – facilitated and driven by online marketing and online presence.

K-pop leads the way, with the most popular artists and videos gaining countless followers online. By December 2014, Psy’s video ‘Gangnam Style’ was on track to ‘break YouTube’, as the number of views approached the largest that can be stored using thirty-two binary digits (rather more than 2.1 billion, assuming you use the first digit to record whether the number is positive or negative), neatly illustrating that every way of counting has its limits. YouTube upgraded its hit-counter to use sixty-four digits instead, and the problem was averted. Games, films, television and of course devices by Samsung and LG are also major exports. Korea is one of the most wired countries in the world in terms of broadband and wireless connectivity and ownership of digital phones, cameras and other devices.
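The arithmetic behind that limit is simple enough to spell out. As a minimal sketch – YouTube’s actual counter code is not public, so this merely illustrates the principle – a signed integer of n bits reserves one bit for the sign, leaving a largest storable count of 2^(n−1) − 1:

```python
# A signed n-bit integer keeps one bit for the sign, leaving n - 1
# bits for the magnitude, so the largest count it can hold is
# 2**(n - 1) - 1.
def max_signed(bits: int) -> int:
    return 2 ** (bits - 1) - 1

print(max_signed(32))   # 2147483647: about 2.1 billion views
print(max_signed(64))   # 9223372036854775807: about 9.2 quintillion
```

A counter held in a sixty-four-bit signed integer will not overflow on any human timescale, which is why the upgrade settled the matter.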

After her breakup, Sia found a new boyfriend fairly quickly, and started a vlogging project that involved making him over. Minho (five foot ten, blood type AB, birthday 17 August) was, to begin with, a nerdy devotee of Japanese animé: overweight, unhealthy and uncharismatic. His best friend was shocked at the very idea of liking a girl who lived ‘outside the monitor’. But Sia was convinced that this rediscovered childhood crush could be transformed: specifically over the course of one hundred days, which she diligently counted down on camera as her followers watched. The project was a winner on social media: her initial announcement was viewed 1.59 million times in the first day, receiving 27,000 likes (and few or no dislikes), and she went on to vlog in detail about Minho’s newly imposed exercise and diet regime (‘W-wait … But … There’s nothing but leaves and sweet potatoes here …’).

His first workout video got 250,000 views in a day; a weigh-in 880,000, with 3,000 comments. As his appearance improved, the couple’s first selfie together gathered 20,000 likes, and a clip from their skiing trip went viral and attracted 1.6 million views. And so on, and so on. The project succeeded in terms both of Minho’s health and appearance (his weight eventually dropped to 169 pounds), and of subscriptions, which rose to over 650,000. As one supporter put it, ‘you’ve become an icon of hard work and effort, and everyone loves you for it’. Yet Sia would never lose sight of the fact that ‘what’s important is the view count’.
