Fundamentals of Information Systems
A Brief History of Computing
The Difference Engine – Charles Babbage (1791-1871). This mechanical machine, built from gears, could perform mathematical calculations to a precision of up to 31 digits.
Alan Mathison Turing – he proposed the Turing machine, an abstract model of a programmable computer. During World War II Turing worked on codebreaking and cryptography at Bletchley Park for the British government.
WWII helped to advance computing, particularly through the research of Howard Aiken and Grace Hopper, who produced the Harvard Mark I in 1944. Grace Hopper is credited with popularising the term computer 'bug': a malfunction in a later machine was traced to a moth trapped in a relay.
Until 1964, companies such as IBM produced unique machines with proprietary operating systems and application software, bespoke to each customer's requirements. IBM wanted a scalable architecture that would be easier to produce and implement, and answered with System/360: a single operating system running across a scalable family of fully compatible mainframes.
Intel mass-produced the first single-chip CPU, the 4-bit 4004, in 1971; it was originally developed for use in a calculator.
In 1974 the Altair 8800 went on sale. The computer caused little stir outside the very small computing fraternity: it was programmed, instruction by instruction, by means of input switches, with output shown on blinking LEDs. In 1976 the Apple I was born, and 1977 saw the production of the Apple II, the first 'cheap' personal computer. In response, IBM mobilised its engineers, developers and marketing personnel to develop its own PC, working with Microsoft to create an operating system. The first IBM PC was sold in 1981.
Software for the New Generation
In the early 80s, the use of computer applications grew. The Lotus 1-2-3 spreadsheet made the PC saleable to businesses; an application that drives hardware sales in this way is known as a 'killer application'.
Computing for the Masses
In 1984 the Apple Macintosh was launched as a compact alternative to the IBM PC. Companies making PCs to IBM's specification and running MS-DOS were cutting drastically into IBM's market share. Unlike Apple, IBM had built its PC from openly available off-the-shelf parts rather than a protected proprietary design, so 'clone' vendors could buy motherboards, Intel processors and peripherals cheaply and quickly produce PCs for the market. This was a key factor in the eventual decline of IBM's PC division. Other Intel processors available to clone vendors were the 80286, 386, 486 and Pentium.
The Internet began as a US military research computer network called the ARPANET (Advanced Research Projects Agency NETwork), in the late '60s. During the '70s, many of the ARPANET users were academic and scientific researchers. In 1986, NSFNET (National Science Foundation NETwork), a faster and larger network into which the ARPANET was connected, went online. In 1990 ANSNET (Advanced Networks and Services NET) took over NSFNET. Prior to this, TCP/IP (Transmission Control Protocol/Internet Protocol) had become the de facto standard protocol with which the networks communicated. By the mid '80s the term Internet was being used exclusively to refer to this global network.
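TCP/IP is still how computers communicate today. A minimal loopback sketch of a TCP exchange, using Python's standard socket library (the echo behaviour and message are invented for illustration):

```python
import socket
import threading

# One thread listens on a local TCP port; the main thread connects,
# sends a message and reads the reply. TCP/IP handles delivery.

def serve_once(server_sock):
    conn, _addr = server_sock.accept()      # wait for one client
    with conn:
        data = conn.recv(1024)              # read the request bytes
        conn.sendall(b"echo:" + data)       # reply over the same stream

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))               # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=serve_once, args=(server,))
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply.decode())  # echo:hello
```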
Tim Berners-Lee introduced the World Wide Web in 1989 with the creation of the hyperlink, HTTP and HTML.
Network types:
WAN, LAN, MAN – wide, local and metropolitan area networks.
Desk area network – includes webcam, speakers, or wireless networking (Wi-Fi, Bluetooth).
Controller area network – embedded microprocessors, say in cars.
Personal area network – the devices around one person.
Storage area network – quick access with a lot of people using it, needing specialist fast-spinning hard disks.
Ethernet – This is a broadcast network: information sent on the shared medium reaches every station on the network. Ethernet is a protocol that lets computers talk to each other; every station can both listen and broadcast. Its access method is CSMA/CD – Carrier Sense Multiple Access with Collision Detection: a station senses the medium before transmitting, and if two stations transmit at once the collision is detected and both retry after a random back-off.
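The sense-transmit-collide-back-off behaviour can be sketched with a toy slotted simulation (a sketch of the idea only, not real 802.3 timing; station counts and the slot model are assumptions):

```python
import random

# Toy slotted CSMA/CD: each station with a frame transmits when its
# back-off reaches zero; if two or more transmit in the same slot, a
# collision is detected and each picks a random back-off
# (binary exponential back-off, as in Ethernet).

def csma_cd(num_stations=3, seed=42):
    rng = random.Random(seed)
    attempts = [0] * num_stations       # collisions seen per station
    backoff = [0] * num_stations        # slots each station must still wait
    done = [False] * num_stations       # has the station sent its frame?
    log = []
    slot = 0
    while not all(done) and slot < 1000:
        ready = [i for i in range(num_stations)
                 if not done[i] and backoff[i] == 0]
        for i in range(num_stations):   # waiting stations count down
            if backoff[i] > 0:
                backoff[i] -= 1
        if len(ready) == 1:             # sole sender: frame goes through
            done[ready[0]] = True
            log.append(("sent", ready[0], slot))
        elif len(ready) > 1:            # simultaneous senders: collision
            log.append(("collision", tuple(ready), slot))
            for i in ready:
                attempts[i] += 1
                # wait a random 0 .. 2^attempts - 1 slots before retrying
                backoff[i] = rng.randint(0, 2 ** min(attempts[i], 10) - 1)
        slot += 1
    return log

for event in csma_cd():
    print(event)
```

Every run starts with a collision (all stations transmit in slot 0), and the random back-offs then spread the retries out until each frame gets through.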
Token Ring – Instead of detecting collisions, token ring avoids them by token passing. The token is a packet of data that circulates around the ring, and only the station holding it may transmit. The token can get lost, so a master (monitor) station checks that it still exists; knowing the cable length tells us how long the token should take to reach the next PC. If the token is not detected in time, a new one is created.
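Token passing can be sketched as follows (a toy model of the idea, not IEEE 802.5; the station names and frames are invented, and the lost-token monitor is omitted):

```python
# The token moves station to station around the ring; only the holder
# may transmit, so collisions cannot occur by construction.

def token_ring(stations, frames, max_hops=100):
    """stations: list of names in ring order; frames: dict name -> message."""
    delivered = []
    token_at = 0                        # index of the station holding the token
    for _hop in range(max_hops):
        name = stations[token_at]
        if name in frames:              # token holder may transmit one frame
            delivered.append((name, frames.pop(name)))
        if not frames:                  # nothing left to send
            break
        token_at = (token_at + 1) % len(stations)   # pass token onwards
    return delivered

out = token_ring(["A", "B", "C", "D"], {"B": "hi", "D": "data"})
print(out)  # [('B', 'hi'), ('D', 'data')]
```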
Ethernet is the dominant technology in use, as it is faster than token ring.
Von Neumann architecture – John von Neumann described this stored-program computer architecture, in which program instructions and data share the same memory.
Memory (RAM) is volatile but the hard drive is not. Memory is very fast; there is static and dynamic RAM. Hard drives are much slower. The operating system is loaded into memory at boot. Virtual memory helps when physical memory is used up, by swapping pages out to disk.
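The virtual-memory idea can be sketched with a tiny paging model (a sketch only: 3 RAM frames, FIFO replacement and the page sequence are all assumptions; real operating systems use cleverer policies):

```python
from collections import OrderedDict

# Toy virtual memory: a few RAM "frames" back a larger address space.
# Touching a page not in RAM is a page fault; when RAM is full, the
# oldest-loaded page is evicted to "disk" (FIFO replacement).

RAM_FRAMES = 3

def access_pages(page_sequence):
    ram = OrderedDict()                   # insertion order = load order
    faults = 0
    for page in page_sequence:
        if page not in ram:
            faults += 1                   # page fault: fetch from "disk"
            if len(ram) >= RAM_FRAMES:
                ram.popitem(last=False)   # evict the oldest page
            ram[page] = True
    return faults

# Touching 4 distinct pages with only 3 frames forces eviction and refault.
print(access_pages([1, 2, 3, 1, 4, 1]))  # 5
```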
The CPU has the ALU for arithmetic processes. It contains registers, which hold data for one CPU cycle, and a program counter, which holds the address of the next instruction. The fetch-execute cycle is:
Fetch => the program instruction at the address in the program counter is read from memory into the CPU instruction register.
Execute => the instruction is decoded and obeyed. The cycle then repeats – it is a loop process.