The Information Age

Evolution of Computing and Communications

Since the dawn of time, as we looked at the numberless stars in the sky above our heads, we have sought to make sense of the world around us by systems of numbering and counting. As social organisation became more complex, so did computation: by the fifth century BC the Greeks were using pebbles as counting aids, a system which prefigured the ancient abacus. But it would be more than 2000 years before the process of computation became mechanised.

The methods of calculation that we have come to call computing have built on a long history of development in mechanical calculation, analogue computation, electrical communication, programmed computation, business data processing, electronics and mathematics, especially logic.

Mechanical calculation

Because of the mental effort required, and the all-too-human tendency to make mistakes, there has long been a demand for machines that can ease the strain of routine arithmetic. The first adding machine dates from 1642:

Machines that add...

Aged only 19, but already a famous mathematician, Frenchman Blaise Pascal invents a machine to help his father to do mathematical calculations. The device consists of a wooden box with sixteen dials which can be turned to do simple addition.

...and multiply

In 1672, a young German lawyer, Gottfried Wilhelm von Leibniz, who as a mathematician will be a co-inventor of the calculus, develops an improved automatic calculator which can perform rapid multiplication or division. Wheels placed at right angles are displaced by a special stepping mechanism. The operator has to understand how to turn the wheels - and thus know the machine's "programming language". The "Multo" machines used by students in our Statistics Department until the 1960s are a direct descendant.

Calling the odds

In the pre-electronic age, the most complex calculating machines are designed to determine the "odds" on horse races. The first such machine in the world is installed in Auckland at the Ellerslie race course in 1913 by the Sydney-based company that would become Automatic Totalisators Ltd. The automatic totalisator – or "Tote" – is designed by George Julius, the son of Bishop Julius of Christchurch.

Analogue computation

Analogue calculators - most commonly slide rules - use physical quantities to represent numbers approximately.

The Meccano differential analyzer

Hartree's Meccano differential analyzer built on the idea of so-called integrating machines, which had been developed in the US and were adopted in the UK by Douglas Hartree. One of Hartree's machines was brought to Auckland in 1950 by Professor Harry Whale for use at the Seagrave Radio Research Centre. It is now on display at the Museum of Transport and Technology in Auckland.

Programmed computation

Solving problems numerically required computations involving thousands of steps. Extensive calculations were performed by hand, or by using simple machines, with large teams of people involved. The people who calculated were called "computors" and the instructions they followed were their "programme" of work.

Analytical engine

English inventor and mathematician Charles Babbage is the first to see the need for the programming of computation to be mechanised. His Difference Engine, designed in the 1820s, is intended specifically for building mathematical tables, but his later Analytical Engine, developed from the 1840s on, is a general-purpose machine with a program punched on cards. "I wish these calculations had been executed by steam", he tells a colleague. Ada, Countess of Lovelace, studies Babbage's designs and comments that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves.

Pioneer venture

L. J. Comrie, a New Zealander from Pukekohe and a graduate of this University, who is regarded as an important pioneer in computation, opens the world's first commercial computing bureau in London in 1938. His bureau, Scientific Computing Service Ltd., uses adapted electromechanical machines for interpolating tables of data, particularly for military use in World War II.

Window display

By the end of the Second World War many externally programmed machines have been developed. Perhaps the most impressive is IBM's Selective Sequence Electronic Calculator (SSEC), which is installed in a shop window on New York's Madison Avenue in 1948.

Long-distance communication

Communication media - first the telegraph, then the telephone and radio - have all been adapted for communication to, and between, computers. Indeed, telephone circuits are still the most widely used means of connecting the home computer to the internet. Television screens provided the first technology for humans to communicate visually with computers.

1837

The world by wire

The telegraph, patented in 1837, is quickly commercialised. The writer Samuel Butler, whose life in New Zealand in the 1860s inspired his novel Erewhon, greets the installation of the telegraph link between Christchurch and Lyttelton with a letter published in the Christchurch Press and signed "Lunaticus", which predicts what we now call the Internet and world wide web:

"We will say then that a considerable advance has been made in mechanical development, when all men, in all places, without any loss of time, are cognizant through their senses, of all that they desire to be cognizant of in all other places, at a low rate of charge, so that the back country squatter may hear his wool sold in London and deal with the buyer himself - may sit on his own chair in a back country hut and hear the performance of Israel in Egypt at Exeter Hall - may taste an ice on the Rakaia, which he is paying for and receiving in the Italian opera house Covent Garden. Multiply instance ad libitum – this is the grand annihilation of time and place which we are all striving for ...."

Butler continues to be concerned about the possibility of machines evolving consciousness, a theme that he explores in his study "The Book of the Machines".

Business data processing

The use of machines for storing and processing business data started in the late 19th century. Cash registers from NCR (National Cash Register, later absorbed into AT&T) appeared in 1884 and calculators by Burroughs (which would later merge with UNIVAC to form Unisys) in 1886. Later, data was stored on cards and magnetically. These electro-mechanical and punched-card book-keeping machines were in use until the 1970s. The main task of early computers was to take over punched-card data processing from these older technologies.

Punched cards

Herman Hollerith develops punched-card equipment for the 1890 US census. This is the origin of the use of punched cards in data processing.

A giant is born

CTR (the Computing-Tabulating-Recording Company) is formed in 1911, building on Hollerith's business. Thomas J. Watson, formerly a salesman with NCR, joins in 1914 and comes to lead the company, which changes its name to International Business Machines in 1924. IBM will come to dominate punched card data processing and, later, the computer business.

Logic

Development of Mathematical Logic over the years gives us the tools to deal with complexity in the design of computer hardware and software. Resolution of questions regarding the logical basis of Mathematics itself leads to an understanding of the capabilities of computers before they even exist.

Boolean algebra

George Boole (1815-1864), an English mathematician at Queen's College in Cork, Ireland, in his "Laws of Thought" develops an algebra that applies to variables restricted to two values, true and false, with the arithmetic operators + and × replaced by the logical operators "or" and "and". His contribution is recognised in the "boolean" variables of programming languages such as Java.
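
Boole's two-valued algebra survives directly in modern programming. As a small illustration (a hypothetical snippet, not part of the original display), the Java boolean type and its operators behave exactly as Boole's "and" and "or":

    // A minimal sketch of Boolean algebra in Java: the values true/false,
    // with "and" (&&) and "or" (||) standing in for Boole's × and +.
    public class BooleLaws {
        public static void main(String[] args) {
            boolean p = true, q = false;
            System.out.println(p && q);        // "and": true × false = false
            System.out.println(p || q);        // "or":  true + false = true
            System.out.println((p && p) == p); // one of Boole's laws: x and x = x
        }
    }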

A switch in time

Charles Peirce, an American philosopher, recognises that the operators of Boolean algebra can be implemented in circuits using switches. Claude Shannon, in his 1938 MS thesis at MIT, shows how Boolean algebra can be applied to the design of circuits involving switches. Since then, all computers have been designed using logical "gates" constructed from switches - at first electromechanical relays and, once the technology developed, electronic devices.

Electronics

Electronic devices operate at speeds thousands of times greater than mechanical ones. As vacuum-tube technology improved in the first half of the 20th century it was applied to calculation and was a proven technology when needed for computers following World War II.

The first vacuum tube

In 1906, Lee de Forest invents the triode electronic switch/amplifier, which is immediately applied to the development of radio. In 1919 the Eccles-Jordan flip-flop shows how electronic switches can be used to store data by "toggling" between positions according to their inputs.

1940

Breaking the code

During World War II special machines are used to assist with breaking the codes of German communications. At Bletchley Park in Buckinghamshire, the "Bombe" machines designed by Alan Turing are used to break the cyphers of the Enigma machine. The most significant machine is the electronic Colossus, developed in 1943 to decrypt the "Tunny" teleprinter codes used by the German high command. Colossus is the first large-scale electronic computing machine.

Military muscle

At the University of Pennsylvania a large-scale electronic computing device called ENIAC is developed for the military under the direction of Presper Eckert and John Mauchly. It is tested in 1945 by extensive calculations involved in the development of the H-bomb.

Steps towards the first stored program computers

Here we cover the period, centred on the late 1940s, before the first modern computers were constructed. Interestingly, what computers could do and how they could be designed were well understood before the machines themselves appeared.

The stored program

Electronic calculation was much faster than mechanical. However, electronic calculators were held to a fraction of their potential by the way they were controlled: either by the wiring of plugboards or by obeying programs read mechanically from a sequence of instructions on cards. Plugboards were too inflexible and card readers too slow. It became clear that the computer could be "unleashed" only if the program steps were themselves stored within the machine's memory and made available at near-electronic speeds.

The universal computer

In 1936, Alan Turing, a brilliant Cambridge University mathematician, proposes an abstract model of what a computer would be and defines precisely what work it might do. His abstract machine - the Universal Turing Machine - becomes the basis for the theory of computation. The UTM mixes instructions and data in the same memory. Turing goes on to work on cryptanalysis during World War II at Bletchley Park, where he designs practical decoding machines including the "bombe" used on the Enigma codes.
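
To give a flavour of the model, here is a minimal sketch (hypothetical and illustrative only - not Turing's universal machine) of a one-state Turing machine that scans its tape and flips each bit until it reaches a blank cell:

    // A minimal Turing-machine sketch in Java: one state scans right,
    // inverting each bit, and halts when it reads the blank symbol '_'.
    public class TinyTuringMachine {
        static String run(char[] tape) {
            int head = 0;                                      // start at the left of the tape
            while (head < tape.length && tape[head] != '_') {
                tape[head] = (tape[head] == '0') ? '1' : '0';  // write
                head++;                                        // move right
            }
            return new String(tape);
        }

        public static void main(String[] args) {
            System.out.println(run("1011_".toCharArray()));    // prints 0100_
        }
    }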

Design standard

John von Neumann, a Hungarian-born mathematician at Princeton, works with the ENIAC designers on the design of a stored-program successor, the EDVAC. His later computer, built at the Institute for Advanced Study in Princeton and described in detail in the paper "Preliminary discussion of the logical design of an electronic computing instrument", has great influence throughout the world. These "Johnniac" machines will become the standard concept for what a computer is, and their style of design will be known as the von Neumann Architecture.

1949

ACE trumped

Meanwhile, Turing himself starts designing a real computer, the ACE, in a proposal made in late 1945 to the UK National Physical Laboratory. His approach is quite different from von Neumann's: he wants a fast machine that is easy to build. The Pilot ACE is the fastest of the early British computers, but it is delayed until 1950. Turing's 1945 report is brilliant in its breadth, from the details of technology, through the design, to possible applications - including playing chess! He understood the importance of programming ("constructing instruction tables") to the use of computers, when he wrote:

This process of constructing instruction tables should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.

The baby boom

The post-war years see many projects to build working, usable computers based on the von Neumann model. A "baby" computer with 128 bytes of memory is operating at the University of Manchester in 1948, and there is a definitely usable EDSAC computer at Cambridge University in 1949. The main technological problem of the first computers is illustrated by these two British machines. The Manchester computers use memories that store data on the surface of cathode ray tubes - these are fast, but unreliable. The Cambridge computer uses a recirculating memory; such memories are cheaper and reliable but serial (accessing one bit at a time) and slow. Recirculating memories - which use sound travelling in tanks of mercury (as in EDSAC) or along wires - and data stored on magnetic drums are common in the work-horse computers of the 1950s. Computing as we understand it awaits the invention of better systems of memory.

Computers & Technology

Technology benefits from miniaturisation because the smaller a device is, the cheaper are its manufacture and operation. Also, with electronic devices, smaller distances make for faster circuits. The first 20 years of computing saw electronic switches reduced from large valves to tiny transistors. Since the late 1960s, electronics has used integrated circuits "etched" onto a surface of silicon, and circuits have shrunk by a factor of more than a million. This progress is described by "Moore's Law", named for Gordon Moore, one of the founders of Intel Corporation, which states that the density of circuits doubles roughly every 18 months; the progress is shown most strikingly by computer memory chips. Computer circuits have also declined in cost by a factor of more than 10 million since 1960. Meanwhile the circuit speed of computers has increased by a factor of "merely" 10,000.

This progress in technology has been the major factor in the spread of the use of computers. To give special attention to technology we have introduced timeline markers of progress, noting every factor of 10 increase in circuit speed and, as an indication of circuit density, every quadrupling of memory chip size.

The 1950s: Developing the technology

The first real computers started work in the early 1950s. They were research projects really, with odd names such as Silliac and Maniac, but they quickly became products as multiple copies of each design were manufactured. Reliable technology - both hardware and software - was phased in during the decade, giving a sound base for growing a thriving industry.

Writing the book

The first manual on programming, written for EDSAC by Wilkes, Wheeler and Gill in 1950, is published as a book in 1951. Appearing before any books describing the computers themselves, the manual shows the importance of developing techniques for writing computer programs.

For sale

In 1951, computers - the Ferranti Mark 1 and UNIVAC - become commercially available. UNIVAC, developed by Eckert and Mauchly, the ENIAC designers, stores data on magnetic tapes rather than cards, giving much greater density and faster access. The Ferranti Mark 1 is the commercial development of the University of Manchester computer.

Play it again - IBM

Although steel magnetic tapes were provided with the UNIVAC, IBM introduces cheaper, lighter plastic tapes with the 726 tape drive in 1953 for their 705 computer - an ingenious vacuum buffer separates the movement of the massive reels from the movement of the tape. Descendants of these tapes will provide archival storage through to the 1980s.

1953 Hillary and Tenzing climb Everest

1953: 1MHz clock – The Whirlwind

The first computers were thousands of times faster than their predecessors. Although it was possible to operate electronic devices at sub-microsecond speeds, the power of a computer depended a great deal on the amount of hardware and hence on cost. The Whirlwind, with its 1MHz clock - the fastest of the early computers - was developed by 1953. It led the way in the use of visual displays and was a test bed for the first core memories. It was also the basis for the US SAGE air-defence system later in the 50s.

The database server

IBM creates the 305 RAMAC in 1956, the first computer with a fixed disk attached. The disk has a capacity of about 5 megabytes (MB). It is intended for permanent data storage, and the RAMAC is the first use of a computer as a database: the computer is primarily a repository for stored data, its role reduced to providing access for users of the data.

Speaking the computer's language

The first computers have been programmed directly in binary machine language. Five years of development leads to the wide availability by 1957 of high-level languages, which allow computers to be given instructions by their users. IBM develops Fortran, and Univac devises MATH-MATIC and FLOW-MATIC, the latter shepherded by Grace Hopper on its way to becoming the standard business language Cobol. Algol is developed as an international standard for program exchange. Fortran goes through many versions and is still being used. The programs, called "compilers", that translate high-level languages to machine language are one of the most important inventions in modern computing.

Better memory

Core memory, where data is represented by the polarity of magnetization of a tiny toroid, is developed by Jay Forrester at MIT, providing an electronic technology for computer memory that is both reliable and fast. It is adopted by IBM and other manufacturers, and remains the main memory technology until replaced by integrated circuit memories in the mid 1970s.

1959 Auckland harbour bridge opened

Transistors

Although fast, vacuum tube computers are expensive and unreliable. When transistors, developed at Bell Labs in 1947, become widely available in 1958, they are immediately used to make computers and become the circuit technology of choice for succeeding decades. The first transistorised computers to be sold commercially are from Philco, but other manufacturers quickly follow, in many cases redesigning vacuum tube computers using transistors. The IBM 7090 (the "709T"), the archetypal computer of the early 1960s which featured in the movie "Dr Strangelove", is one such.

1961

The 1960s: The age of IBM

The computing industry grew rapidly, building on 1950s technology which had improved considerably but not changed fundamentally. Software development became the main focus. Remote access to computers using telephone lines became common. Smaller, cheaper computers, particularly those used for controlling machines, became available. Sales of computer hardware and software were dominated by IBM - the other major companies, Burroughs, Univac, NCR, Control Data and Honeywell, were known as the BUNCH.

Computing New Zealand

The IBM 650, which starts up in 1961, is the first computer in New Zealand. Installed at Treasury to automate the government payroll, it is old technology - it uses valve circuits and a rotating drum main memory, and was first introduced in 1954 - but the 650 widely replaces punched-card data processing equipment and is a major industry workhorse world-wide. At around the same time, the first British computer in New Zealand, an ICT 1201, is set operating in Shell House, also in Wellington.

Exchange of ideas

In 1960 the New Zealand Computer Society is established. Initially called the "Data Processing and Computer Society", it changes its name in 1968. The society has 200 members by 1965 and runs national conferences from 1968, allowing members to exchange ideas and see exhibitions of recent developments. In 2012 the society is renamed the Institute of IT Professionals (IITP).

Slicing time

Sharing computing power among multiple users simultaneously - using the computer's speed to give each user the impression of sole control - helps spread the high cost of computers. Following the suggestion of John McCarthy, the first time-sharing system is developed in 1962 for the PDP-1 computer manufactured by Digital Equipment Corporation (DEC). The PDP-1 is even better known as the computer for which the first video game - Spacewar! - was developed by Steve Russell. One of the most important developments in time sharing in the 1960s is IBM's VM (CP/CMS) time-sharing operating system for their System/360 computers. This introduced the idea that the computer, as well as its memory, should be virtual - a concept that has found continued applicability up to the present day.

Virtual memory

Once it is clear that there is an advantage in having computer systems that interleave the execution of multiple programs, it becomes desirable to ensure that programs do not deal with hardware resources directly. In 1961 the Ferranti Atlas introduces the concept of "virtual memory", which allows programs to be stored anywhere in computer memory and located automatically, regardless of the addresses used in the programs. Other computers, such as those from Burroughs, use different techniques to achieve the same goal. Virtual memory comes to IBM mainframes in 1975 and to PCs with the 80486 in 1989.
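
The mechanism can be sketched in a few lines (a hypothetical illustration, not how Atlas was actually built): every address the program uses is split into a page number and an offset, and a page table maps each page to wherever the operating system has placed it in real memory:

    // A minimal sketch of virtual-to-physical address translation in Java.
    // The page size and page-table contents are illustrative assumptions only.
    public class PageTableSketch {
        static final int PAGE_SIZE = 4096;               // 4 KB pages
        static final int[] PAGE_TABLE = {7, 3, 12, 5};   // virtual page -> physical frame

        static long translate(long virtualAddress) {
            int page   = (int) (virtualAddress / PAGE_SIZE);
            int offset = (int) (virtualAddress % PAGE_SIZE);
            return (long) PAGE_TABLE[page] * PAGE_SIZE + offset;
        }

        public static void main(String[] args) {
            // The program thinks its data is at address 5000; the hardware
            // finds it in frame 3, at physical address 3*4096 + 904 = 13192.
            System.out.println(translate(5000));
        }
    }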

1962 Public television arrives in NZ

University computers

The University of Auckland and the University of Canterbury install IBM 1620 computers in 1963. Victoria installs an Elliott 503 a short while later in co-operation with the Applied Mathematics Division of the Department of Scientific and Industrial Research (now Industrial Research Ltd). The 1620 is designed to be relatively inexpensive, but it is also very slow. The Auckland 1620 is on display on the first floor of building 303S.

Remote access

The SABRE airline reservation system, developed in 1962 by American Airlines, is the first large-scale transaction processing system. It uses a large, expensive central computer whose database is accessed remotely over phone lines. Users sit at inexpensive terminals - screens displaying text, or even teletypes. The computer's role is reduced to that of a server.

All-encompassing architecture

IBM's 360 range, which features computers of various speeds and prices all sharing the same architecture, is a major conceptual step in 1964. The instruction set architecture is designed to be independent of the model of machine (programs written in 1965 can still be run on IBM servers). Now, a computer system is a range of software services which the hardware supports. With OS/360, the modern operating system comes of age.

Sharing power

Computer bureaus that provide computing power to serve multiple users arrive in New Zealand in 1964. In Wellington, Denis Trotman founds Computer Services Ltd. in September and offers services using IBM computers. CSL grows to operate in the three main centres of New Zealand until closing in 1987. Meanwhile, in Christchurch, a group originating in accounting firms orders an ICT 1902 computer which will be used by Computer Bureau Ltd. (CBL), founded by Bernard Battersby and Paul Hargreaves in July 1965. CBL rapidly establishes computer centres in Wellington, Hamilton and Auckland, servicing some of New Zealand's largest companies. It later changes its name to Datacom and grows to be New Zealand's largest homegrown IT company, employing over 3000 people here, in Australia and in Asia.

1964: 10 MHz Clock: The super-computer

The CDC 6600, designed by Seymour Cray, is the first "supercomputer". When introduced in 1964, the 6600, with its 10MHz clock, is by far the most powerful computer available for scientific computing.

The evolution of core memory

Core memories evolve remarkably from the coarse doughnuts of the 1950s, becoming almost invisible and manufactured by machine rather than by hand. Core will remain a competitive technology until the mid-1970s.

1969 Apollo 11 delivers the first men to the Moon; they land in a lunar module supported by a computer with an 8K memory!

Integrated circuits

The next quantum leap in circuit technology will be the development of complete circuits - transistors, resistors, capacitors and wires - etched directly on a silicon surface using techniques extended from printing. Integrated circuits, developed through the 1950s, arrive in 1959 but do not come into wide use until the late 1960s. Before ICs, computer circuits were constructed from discrete components in many ingenious schemes.

Computers - you can bank on them

In 1967, New Zealand banks decide to computerise data processing in time for the introduction of decimal currency. They set up Databank, an independent company, to do the job under the initial leadership of CEO Gordon Hogg. Databank grows to operate the largest computer centres in New Zealand. It is able to ensure that any cheque written on any major NZ trading bank will be processed within 24 hours. This co-operation between banks will make for fast introduction of ATMs and EFTPOS in the 1980s. Databank is later taken over by EDS.

"Goto considered harmful"

The main aid to programming at this time is the flow chart. The use of flow-chart design, coupled with the ability to transfer control anywhere within a program, results in programs that are hard to debug and maintain. Dijkstra's 1968 letter counsels giving more attention to the structure of programs, leading to the techniques of structured programming. In 1970 the Pascal language, intended for educational purposes, is developed by Niklaus Wirth; it in fact remains the main educational language for 25 years. The simple unstructured language Basic had already been developed for educational use at Dartmouth College in 1964 by John G. Kemeny and Thomas E. Kurtz.

Software advances

Digital Equipment Corporation's PDP11 is a very successful computer range whose architecture is developed at Carnegie-Mellon University. It is very easy to program at the machine language level and will expand into a line of computers that lasts more than a decade and is very popular in universities, including this one. From 1969, staff at Bell Telephone Labs in the US develop their own operating system, Unix, for Digital's minicomputers, along with their own programming language, C. Unix flourishes on the PDP11, will be used on many different systems over the years, and is now represented on PCs by Linux. C becomes a major language for software development.

Small is beautiful

The first departmental mini-computer at the University of Auckland is the Hewlett-Packard 2116, installed in 1969. Mini-computers are small, relatively inexpensive computers designed for individual use as machine controllers or departmental computers. One of the best known is the PDP-8 of 1965, examples of which are used at Auckland University. The HP2116 is used for controlling our Physics Department's van de Graaff accelerator, located on the site of building 303S. A very common input/output medium for these low-cost computers is punched paper tape.

1970 1K RAM chips

The 1970s: Onward & upward

Large computers, now called mainframes, steadily grow bigger and more powerful. Future technologies that will threaten the mainframe are under way.

1970: Memories are made of this

The 1K RAM chip, such as the Intel 1103, arrives with the IBM System/370 in 1970. This is the first time that integrated circuit memory is used on a wide scale, and it ushers in an age of steady but dramatic development of the technology. We will mark how the RAM chip size increases over the years, though without any images. The density of logic circuits tracks the memory improvement, but at a lower scale.

1971 Satellite communications reach New Zealand via the Warkworth station.

Making it micro

The first microprocessor chip, the Intel 4004, is unveiled in 1971. Initially intended to control an electronic wristwatch, it enables the development of the embedded controller. Today almost all complex devices, from automobiles to electronic zithers, contain one or more microprocessors.

Mainframes and terminals

All New Zealand universities buy Burroughs B6700 computers in a bulk deal in 1972 and have full-scale mainframes for the first time. Most work is performed in the old punched-card batch manner, but remote access through telephone connections is provided for the first time.

The modern disk

The IBM Winchester disk, released in 1973, is the technology base from which the modern hard disk will be derived. Before the Winchester (the model 3340) most databases have been held on removable disk packs. The 3340 also uses a removable module, but the module includes the reading mechanism and, because the device is sealed, it can accommodate much greater density of storage. In practice, the 3340 disks are not often dismounted; the fixed hard disk is born.

Computer Science reaches the Antipodes

In 1973 Computer Science departments are established at the University of Canterbury (Prof John Penny) and Massey University (Prof Graham Tate).

The personal computer

The Altair, a kitset, features on the cover of Popular Electronics magazine in January 1975. It is followed by other "do it yourself" computers in 1976. Microsoft Corporation is founded in 1975, its first product being a Basic interpreter for the Altair.

1974 4K RAM chips

An oracle's relations

E. F. (Ted) Codd (1923-2003), an English employee of IBM in San Jose, invents the concept of a Relational Data Base in the late 1960s, a systematic way of organising data stored in computer systems. His colleague C. J. Date publishes a series of books popularising the concept, including "An Introduction to Database Systems" in 1975. Oracle Systems Corporation is founded by Larry Ellison in 1977 to develop and market systems based on the Codd and Date ideas.
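
Codd's idea is to hold data as tables (relations) of rows that can be combined by operations such as selection and join on shared keys. A toy sketch (hypothetical data and names, with no real database involved) of joining two tiny tables:

    import java.util.List;

    // A toy illustration of the relational idea: two "tables" joined on a key.
    public class RelationalSketch {
        record Customer(int id, String name) {}
        record Order(int customerId, String item) {}

        public static void main(String[] args) {
            List<Customer> customers = List.of(new Customer(1, "Ada"), new Customer(2, "George"));
            List<Order> orders = List.of(new Order(1, "tape"), new Order(2, "cards"), new Order(1, "drum"));

            // In spirit: SELECT name, item FROM customers JOIN orders ON id = customerId
            for (Customer c : customers)
                for (Order o : orders)
                    if (o.customerId() == c.id())
                        System.out.println(c.name() + " ordered " + o.item());
        }
    }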

Keeping data secure

The invention of public key encryption in 1977 is a response to the need to provide security for private information sent between computers via public communication networks. Whitfield Diffie and Martin Hellman propose an approach for encrypting messages which involves both a public and a private key. A practical version of this idea is the RSA algorithm of Rivest, Shamir and Adleman. Public key cryptography has since become a basic security tool for the Internet, and it underlies digital signatures and electronic financial transactions.
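
The arithmetic behind RSA fits in a few lines. A toy sketch (using tiny, insecure numbers chosen purely for illustration; real keys use primes hundreds of digits long):

    import java.math.BigInteger;

    // A toy RSA sketch: encrypt with the public key (e, n), decrypt with the private key d.
    public class RsaToy {
        public static void main(String[] args) {
            BigInteger p = BigInteger.valueOf(61), q = BigInteger.valueOf(53);
            BigInteger n = p.multiply(q);                          // public modulus
            BigInteger phi = p.subtract(BigInteger.ONE)
                              .multiply(q.subtract(BigInteger.ONE));
            BigInteger e = BigInteger.valueOf(17);                 // public exponent
            BigInteger d = e.modInverse(phi);                      // private exponent

            BigInteger message = BigInteger.valueOf(42);
            BigInteger cipher  = message.modPow(e, n);             // encrypt
            BigInteger plain   = cipher.modPow(d, n);              // decrypt
            System.out.println(cipher + " -> " + plain);           // recovers 42
        }
    }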

1977 16K RAM chips

The Four Color Theorem proved

The power of a computer to search through large numbers of cases beyond human ability is used in 1977 in resolving a longstanding mathematical proposition, that four colours are adequate to colour any map on the plane so that no two adjacent countries or states have the same colour.

Floppy disks

The floppy disk is developed by IBM as a way of loading start-up programs into computers with volatile memories. The first floppies, in 1972, are 8 inches in diameter. These floppy drives and disks were further developed and used in some of the first PCs, replacing punched cards for transporting data. But it is the release of the 5 1/4-inch disk in 1978 that makes the floppy widespread. The 3 1/2-inch floppy from Sony will become standard in 1984. Floppy disks remain an important removable storage medium for PCs until the arrival of flash memory sticks after the millennium (though many less satisfactory removable storage technologies were tried in the interim).

Local pioneers

Two New Zealand computer programmers, Gilbert Simpson and Peter Hoskins, develop a "4th generation language" called LINC that simplifies application development. They create a company, Aoraki Corporation, based in Christchurch to further develop their products. LINC is adopted and marketed by Burroughs Corporation, later Unisys. Aoraki later becomes Jade Software Corporation. Sir Gil Simpson is knighted in 2000 for his service to the computer industry and the broader community.

The PC as a product

The first PCs, such as the Altair, are intended for hobbyists who know the details of electronics and software. From the late 1970s, complete computers - the Apple II, the Commodore PET and the TRS-80 - are available that can be used without such detailed knowledge. Software products are developed that make computers worthwhile in business and entertaining in the home. VisiCalc is the start of spreadsheets, and there are many computer games - Pac-Man, for example.

1980 64K RAM chips

The 1980s: PCs and networks

Commercial applications are still dominated by large, centralised computers, but smaller, cheaper computers are catching up fast. Although many networks are developed and the Internet is under way, the worlds of the mainframe and PC remain quite separate.

1980: The University of Auckland's Computer Science Department

The Computer Science department is founded in 1980, before which time computing and programming had been taught in other departments under the guidance of a governing committee. In 1981 the department is equipped with Zenith Z89 PCs, some of which are built by staff from Heathkits. Computing is also developed as a specialty in the Management Science and Information Systems department in the Faculty of Commerce.

The IBM PC

When the world's largest computer company begins manufacturing personal computers, the machines at last stop being seen as toys for geeks. The IBM PC, which arrives in 1981, is the smallest IBM computer ever. The successors of the IBM PC are now the most common computers sold, although they are manufactured by many suppliers, such as COMPAQ/HP and Dell. IBM also manufactured and sold PCs but eventually sold that business to the Chinese company Lenovo.

Local area networks (LANs)

In 1981, our Computer Science department implements its own LAN to connect Z89s with printers. LANs have been growing in popularity, at first as proprietary systems. An important step is the development of Ethernet in 1973 by Xerox's Palo Alto Research Center (PARC). Later, LANs are provided to connect together small computers - for example AppleTalk from Apple Computer.

Exporting knowhow

Progeni, founded in 1968 by Perce Harpham as Systems and Programs, is the first NZ company to export software. It sells systems to Dulux in Australia and a school timetabling program to the UK. In association with Wellington Polytechnic the company develops the Poly computer, a hardware and software package targeted at education. In 1983, it secures its first sales of the Poly in China.

1982: The 100 MHz clock

The Cray X-MP of 1982, which has two vector processors, builds on the record-setting Cray-1 of 1976. Seymour Cray has left CDC and set up his own company, Cray Research. Machines designed by Cray will lead the world of supercomputers for three decades.

Holes in the wall

In 1982 Automatic Teller Machines (ATMs) appear on New Zealand streets, followed by EFTPOS in 1984. Because of the Databank infrastructure, New Zealand is a world leader in adopting electronic banking.

1983 256K RAM chips

WIMPS and GUI

In 1984 the Apple Macintosh introduces the Graphical User Interface (GUI). Modern, user-friendly computing arrives. The Apple Macintosh is particularly targeted as a "transportable" computer for students, and Auckland University joins the Apple Macintosh Consortium. The Macintosh approach, dubbed WIMPS (for Windows, Icons, Menus, Pointers), makes the Mac the first widely available PC to use the modern interface. The WIMPS approach will later be adopted by Microsoft with their "Windows" software.

The virtual typewriter

Microsoft Word, which followed MacWord the year before, arrives in 1985. It is now the most widely used word-processing software. Aldus PageMaker desktop publishing also starts, coupled with superior-quality laser and inkjet printers. Typesetting and the typewriter become obsolete overnight.

Compact Disks

The arrival of compact disks for data storage in 1985 allows pictures and the beginnings of video on computers, and lays the foundations of what are now called "multimedia" applications.

1986 1M RAM chips

Electronic mail

Email is universally adopted throughout our Computer Science department in 1986. It is initially used for document and message interchange within the department. It also provides access to international mail via the Internet, which is quite unreliable at first. The Internet protocols were developed in 1977, building on the Arpanet that was first operational in 1969.

The phone on the move

Telecom introduces the cellular phone to New Zealand in 1987. The arrival of microprocessors makes possible the encoding and decoding in the handset, as well as the control needed for the network to find a path to a mobile user.

1989 4M RAM chips

The plague

One of the first malicious programs to cause widespread damage, the Robert Morris worm, is identified in 1988. It is named after Robert Morris, Jr., a graduate student at Cornell who writes an experimental, self-replicating, self-propagating program called a worm and injects it into the Internet. But the program replicates and reinfects machines at a much faster rate than he has anticipated. Machines crash at many locations, including universities, military sites, and medical research facilities. Morris is convicted of violating the Computer Fraud and Abuse Act.

Kiwi ingenuity

Many small software companies founded in the Auckland area enjoy success. Among them is Peace Software, founded in 1988, which becomes a leading provider of customer-service software for deregulated utilities and an important local IT employer, particularly of university graduates. Other companies, such as Orion, are successful in niche medical markets. Some of these local start-ups are eventually sold to larger overseas companies.

The 1990s: The global village

The 1990s sees major changes in the computer industry as mainframes are challenged by the superior performance and price of personal computers. By the end of the decade the fastest computer processors available are single-chip microprocessors as used in everyday PCs. The world wide web grows beyond all expectation and by 2000 there are over 7 million websites internationally.

The modern PC

The 80486 chip from Intel arrives in 1989. The 486, as it becomes known, provides the architecture from which modern PC chips such as Pentium, Athlon and their successors are developed.

A clearer window

Microsoft originally introduced Windows in 1985, but it is only with Windows 3 in 1990 and the arrival of more powerful PCs that Windows becomes a preferred alternative to DOS. Later versions, Windows 95, 98, ME, NT, XP, "7" and "8" will provide further refinements in operating system performance.

Welcome to the world

The PACRIM fibre cable, laid in 1992, connects New Zealand to the electronic world. Fibre optic cable communication, well advanced as early as 1985, develops steadily and spectacularly in capacity and in amplification technology. But the commissioning of PACRIM means submarine fibre cables replace geostationary satellites as the preferred method for global high-speed communication.

Weaving the web

The development of the Mosaic browser in 1993 leads to the World Wide Web (WWW) - defined at CERN in Geneva in 1991 by Tim Berners-Lee and Robert Cailliau - becoming the dominant Internet application. In New Zealand, the link to the Internet in the US, established in 1989 through servers at Waikato University, lays the foundation for fast adoption of the technology.

Woody creates a buzz

Toy Story, the first fully computer-generated film, released in 1995, is a milestone in the field of Computer Graphics, which uses computers to model and display natural scenes realistically. Computer graphics started in the 1950s with primitive displays that built pictures from straight lines. As computers become more powerful and screens improve to provide fine-resolution bit-mapped displays, the field of computer graphics expands. By the 1980s computer graphics are being widely used in advertising. New Zealand has been a strong participant, with companies such as Animation Research Limited in Dunedin which developed Virtual Spectator for depicting America's Cup yachting. Expat New Zealander Andrew Adamson's animated film Shrek and Peter Jackson's trilogy The Lord of the Rings show that our workshops can foot it with the world's best.

1993 16M RAM chips

Home delivery

The first local provider of access to the Internet is ihug (for Internet Home Users Group) in 1994. Telecom NZ launches XTRA two years later. The widespread use of the Internet leads to the rise of "E-commerce" in the late 90s.

Java jive

The development of the Java programming language in 1995 makes it possible for programs to be delivered over the Internet and executed remotely. The nice features of the language result in it becoming a standard introductory language in universities worldwide (though it is being widely replaced by Python for this purpose). It is an object-oriented language which builds on C++ of 1985 and traces back to Smalltalk in the early 1970s.

1996 64M RAM chips; Auckland Sky Tower completed

1996: Tower power

The Auckland Sky Tower, completed in 1996, is now the international Internet hub for New Zealand.

Computers are smarter

In 1997, Deep Blue, a special-purpose IBM computer and program, beats the world chess champion Garry Kasparov. Computer chess has been an ambition in Computer Science since the very early days - Turing wrote one of the first chess programs. It is part of the field of Artificial Intelligence (AI) - writing programs that endow computers with attributes of human reasoning ability.

The final frontier

The Mars Pathfinder's Sojourner rover, which rolls onto the surface of Mars on July 6, 1997, is the culmination of years of progress in the field of Robotics, which commenced in the 1960s when computers became inexpensive enough to be dedicated to controlling machines.

The heart of the matter

The University of Auckland's Engineering Science Department installs a supercomputer in 1998, the first here with multiple processors intended for massive numerical calculations. It is housed in the Biomedical Engineering Research Unit. Medicine has also seen impressive application of computers, particularly Computer Aided Tomography (CAT) scanning.

Unconventional but discrete

Our Centre for Discrete Mathematics and Theoretical Computer Science hosts the first international conference on Unconventional Models of Computation (UMC'98). This discusses approaches to calculation - quantum computing for example - which are different from and extend beyond the model proposed by Turing.

Programming as a profession

The establishment of a Software Engineering degree in 1999 marks the coming of age of the Computer Science Department. The department has always taught programming to its students but this degree, a joint venture between Computer Science and the Department of Electrical and Electronic Engineering, recognises the importance of programming as a profession.

Music for free

1999 is the year music is free, albeit briefly. The MP3 standard for compressed sound, a subset of MPEG accepted some years earlier, becomes a popular way of encoding music once software for playing MP3 files arrives. Napster introduces software for exchanging MP3 files, but it is shut down by pressure from the music industry and is later replaced by systems which charge for single-song downloads.

Computing on the move

The first decade of the 21st century is one of contrasts for computing. The underlying technology shows signs of becoming mature, but computing applications have an increasing impact on our lives, both for work and at home.

2000: 1 GHz clock

Processor chips from Intel and AMD break the 1 GHz barrier. But the increase in performance of single-chip processors has slowed; the fastest processors during the decade run at less than 5 GHz. However, chip density continues to increase. Having multiple processors (called "cores") on the one chip is preferred to making processors faster, and chips with 2 or 4 processors become common. Special graphics processors (GPUs) are included in PCs and electronic game players. GPUs, which involve a parallel programming model, evolve to rival the power of general processors and are used for computation in some supercomputers.
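
The extra cores only pay off if software spreads its work across them. A minimal sketch (illustrative only) using Java's parallel streams, which divide a computation across whatever cores the machine provides:

    import java.util.stream.LongStream;

    // A minimal sketch of multi-core use: the same sum computed sequentially
    // and in parallel; the parallel version is spread across available cores.
    public class MultiCoreSketch {
        public static void main(String[] args) {
            long n = 100_000_000L;
            long sequential = LongStream.rangeClosed(1, n).sum();
            long parallel   = LongStream.rangeClosed(1, n).parallel().sum();
            System.out.println(sequential == parallel);  // true: same result either way
            System.out.println(Runtime.getRuntime().availableProcessors() + " cores available");
        }
    }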

2001 256Mbit RAM chips

Flat, but outstanding

Flat screen displays first appear in portable "laptop" computers. As the displays increase in size and decrease in cost they replace CRTs completely, including in desktop PCs. Many different technologies are used in flat displays, including LCD, LED and even the "electronic paper" used in book readers such as the Kindle, where books are sold and downloaded digitally. Development of touch-sensitive flat screens leads to entirely new applications.

That's Flash!

"Flash" is a type of semiconductor memory developed by Dr Fijio Masuoka for Toshiba c. 1980. It is rugged and non-volatile – what is stored remains when the power is off. But it has disadvantages such as needing to be erased in blocks before being written. The first chips in 1988-9 are for specialised use but flash improves until it becomes the cutting-edge semi-conductor product. There are two types of flash - nor and nand. Nor is more suited for direct execution of programs. Nand is simpler and very fast for block writing so is a disk substitute, used for camera memories, memory sticks, and SSD (solid state disk) in mobile devices. With Samsung as the leading manufacturer, its increase in density has been phenomenal, starting with 1 Gbit chips in 2002 and up to 32 Gbits by the end of the decade. The USB memory stick completely replaces the floppy disk.

Home sweet home

The Computer Science Department occupies its new building. Built over space occupied by the AURA 2 accelerator, it is officially opened by the Vice Chancellor Dr John Hood on September 23, 2003. This timeline was originally designed to decorate the new building but was redesigned in 2013 after the entrance corridor was demolished.

Go Google That

Finding information on the "web" is facilitated by free search engines, culminating in Google. This is so widely used that the term "Google" enters our language. On-line sites like Wikipedia provide massive stores of usually-reliable information. Access to geographic data on-line becomes the norm, with Google Earth and Google Maps being particularly important. Projects to scan and make available out-of-print books and newspapers provide easy access to historical data. Digital cameras are included in mobile phones - crude still photos to start with, but eventually with enough quality to stand in for everyday cameras. At every news scene now, where there is a phone, there is video of the event. The use of video on the web arrives in a big way with YouTube. Advertisements and unknown entertainers become overnight sensations when videos go "viral".

The human genome is sequenced

The human genome, or DNA, comprises a sequence of 3 billion nucleotide bases of only 4 types, abbreviated as A, G, C, and T. The actual sequence of bases determines how the DNA acts so it is important that it be found. The task seems forbiddingly large but, after many years of effort by competing teams, using special machines to generate short sequences, and the storage, processing power and algorithms of Information Technology, it is finally declared that the human genome has been sequenced. This is an early success for the field of Bioinformatics – the application of computational methods in Biology, Biotechnology and Medicine. The University of Auckland commences degree programs in the area and Bioinformatics research flourishes.

KAREN wakes

Formed in 2005, REANNZ (Research and Education Advanced Network NZ Ltd.) owns and operates New Zealand's own advanced network, known to many as KAREN (the Kiwi Advanced Research and Education Network). REANNZ is a Crown-owned company providing, initially, a 10 Gbit/sec network connecting New Zealand's universities, polytechnics, Crown research institutes, schools, libraries, museums and archives, and out to the rest of the world.

2007 1 Gbit RAM chips

Electronic Commerce comes of age

After a slump at the beginning of the decade caused by the bursting of the overenthusiastic "dotcom bubble", electronic commerce matures as a feature of everyday life. Amazon and others market books and consumer products. eBay in the US and Trade Me in New Zealand introduce world-wide on-line auctions. Apple's iTunes and others sell audio and video on-line. Secure banking from home becomes the norm. The usage of the internet continues to grow exponentially during the decade.

1 Gbit RAM chips

Although 1 Gbit RAM chips were announced on schedule in 2007, they have lost their glamour as the leading edge of circuit technology. Intermediate-density chips (512 Mbit), the stacking of multiple lower-density chips in a package, and the slow adoption of 64-bit software all delay the widespread use of denser chips. By the end of the decade 2 Gbit chips are in use and 4 Gbit has been demonstrated. However, the action in circuit technology shifts to the larger market of flash memory.

The Terabyte disk

Density improvements with hard disks continue at historic rates. The Terabyte barrier is crossed in 2008 - hard disks have improved a billion-fold since their introduction. Very small hard disks, down to 2cm in diameter, are developed for entertainment devices, but these are replaced by flash memory. Most data storage is on 3.5" disks; laptops use 2.5" disks, though these start to be challenged by flash memory. These dense disks go beyond the needs of the individual user and lead to the creation of publicly available centres for back-up storage, which need the terms "Petabytes" (quadrillions of bytes) and "Exabytes" (quintillions) to describe their capacity.

Common Sensing

Coupled with ubiquitous networked microprocessors and databases, technologies that allow sensing of data carried by people and other objects change traditional business models. The spread of bar codes and scanners in supermarkets is a noticeable change for consumers, leading in some supermarkets to self-check-out. The same technology is applied in many areas, leading to self-check-in at Air New Zealand and elsewhere. Other technologies now having great influence on the end-user include smart cards (credit cards that carry digital data and a processor), RFID tags (which return data by radio when stimulated, powered themselves by radio) and QR (Quick Response) tags.

You can take it with you

Mobile phones gradually take on the functions of general computers. They act as Personal Digital Assistants with wireless access - of particular note is email via the BlackBerry. Finally, smart phones such as the iPhone from Apple provide full access to web services. The initial method of connecting mobile devices to the internet is by local wireless networks, later augmented by cellular telephone networks. The satellite-based Global Positioning System (GPS) is linked with maps for improved navigation by companies such as Navman in Auckland, eventually available in every advanced automobile and cell phone. The lack of a keyboard and mouse is overcome by the introduction of touch-sensitive screens and the recognition of user gestures, particularly with iOS on the Apple iPhone and iPad, then by their competitors using the Android OS.

Social Media

Telephony over the web, led by Skype, at last makes the videophone a reality for everyone. Individual and group blogs offer competition to conventional media, which are forced to adapt, though many struggle to find new business models. Photography becomes predominantly digital; powerful processor chips in cameras give quality (e.g. anti-shake) above what was available in the past. With all personal data and images now digitised, companies are created to allow personal information to be shared with friends - the most successful and influential being Facebook, whose usage grows at an enormous rate and radically alters the way people interact. Sites are created to assist in the on-line publication of blogs (web logs), which range from diaries to specialist and political comment. The internet becomes a political tool, with the disclosure of archives of secret files and the ability to bypass conventional mainstream communication media.

The future is now

It is too early, in 2013, to know what the main developments of this decade have been so far, much less to predict the shape of things to come, though social media, such as Twitter, will likely continue to be a major theme.

eScience

NeSI (NZ eScience Infrastructure) provides High Performance Computing facilities, comprising tens of thousands of processors, to New Zealand. It supports researchers with three state-of-the-art facilities including one at The University of Auckland. NeSI operates infrastructure and software that support eScience – collaborative scientific on-line communities that share large-scale IT resources. In New Zealand, the first public eScience facility, NZ Genomics Ltd (a gene sequencing and bioinformatics infrastructure provider founded in 2010), processes mountains of data using massive computational support and software developed by Biomatters, an Auckland-based New Zealand company. NeSI works alongside REANNZ (the advanced research network) and NZGL as part of our national eScience system.

2011 Rugby World Cup

"The cloud" is built on Queen's wharf to entertain visitors. New Zealand beats France at Auckland in a nail-biting 8-7 final.

Farming out the work

Mainframes are still made, now with up to 64 processors directly sharing memory and with many processor cores per chip. Further power is provided by servers - multiple processor chips packaged on multiple circuit boards (sometimes termed blades). The largest multi-Petaflop supercomputers are effectively closely-connected sets of servers. To service internet commerce, thousands of processor servers and their disk stores are gathered together to form "server farms". A movement starts to provide computing once again as a utility service. Because the user does not know where the processors and data are located physically, this utility comes to be termed "the cloud".