![](/wp-content/uploads/2024/04/041824-41-History-Technology.jpg)
Charles Babbage is credited with building the first mechanical computer in the 1820s.
![](/wp-content/uploads/2024/04/ShambhaviRoy100.jpg)
By Shambhavi Roy
Assistant Software Engineer
![](/wp-content/uploads/2024/04/ClintonDaniel100.jpg)
By Clinton Daniel
Associate Professor of Instruction
![](/wp-content/uploads/2024/04/ManishAgrawal100.jpg)
By Manish Agrawal
Professor of Information Systems
Introduction
How do you spend your free time? How do you communicate with friends? How do you complete schoolwork? How do you find the information you need? How do you buy the things you like? What are some of your most prized possessions? What do the business celebrities you admire do? What are some of the popular, high-paying jobs in your town that you might want after completing school? In the first half of the 21st century, Digital Information Technologies (DIT), or Information Technologies (IT) for short, are likely to be part of the answer to all these questions. Slowly but steadily, DIT has come to influence our lives to the point where it is now necessary for all students to know the essentials of using DIT to improve their lives. This book, the accompanying materials, and the DIT course aim to do just that. At the end of this class, you should be able to explain to friends and family how computers work. You should also feel comfortable using computers to improve your own effectiveness.
Information technologies are powerful because people crave information, and IT makes that information easy to get. The world is teeming with information about who we are, where we live, and what we do to sustain ourselves. Until computers became widespread around 1990, this information was written on paper, published in newspapers or magazines, and stored in libraries and in stacks of files in offices. In your own family, you and your parents probably cherish photo albums of printed photographs. Before computers, there was no easy way for people to share information with each other, and businesses had no easy access to data about their own customers with which to understand larger patterns and take corrective action.
Beginning in the 1990s, computers became popular, and starting in the 21st century they were connected to each other through the Internet; the field of Information Technology (IT) was born. IT is the use of computers and networking technologies to store, process, and retrieve information. IT is now so important to organizations that IT investments, which include computers, networks, software, and employees, are among the largest expenses for most organizations.
One result of this transformation of organizations is that regardless of what type of work you plan to do when you complete school, you will be expected to be as comfortable using IT effectively as you are using your first language. Doctors are expected to read and write electronic medical records as they interact with patients, CEOs and business owners are expected to interpret reports generated by IT, and government staff are expected to work with citizen records as they respond to user queries. In turn, some of the most desirable jobs in the economy are emerging from this infusion of technology.
![](/wp-content/uploads/2024/04/041924-01-History-Technology.jpg)
The Department of Motor Vehicles (DMV) is a great example of the impact of IT on our lives. See Figure 1. Today, when you buy a car and go to the DMV to register it, the clerk at the DMV office records all the information about you and your car (make, model, VIN) in a computer to establish your ownership of the car.
![](/wp-content/uploads/2024/04/041924-02-History-Technology.jpg)
Your data will be cleaned, processed, and moved across several computers and finally stored in a data center. See Figure 2. This data center will also hold similar data about all car owners across the state.
How does IT help you and the DMV? Why should your state government invest millions of dollars each year to computerize the DMV? In 2020, there were 276 million vehicles registered in the US.1 The DMV needs to store information about all these vehicles and drivers safely. The DMV must also make the information available quickly when necessary. For example, when you buy a car, the DMV must make sure that the seller is the owner of the car and has the right to sell it to you. When police officers stop drivers on the highway for speeding, it is very useful for officers to be able to check the driver's license and car registration information. This keeps the roads safe for other drivers. Less dramatically, computerization also makes it easy for the DMV to send drivers automated reminders to make annual registration payments. Can you imagine doing all this across all 50 states without IT?
To pull off this feat of information technology, the DMV uses lots of expensive computers that are connected through complex networks and software programs that interact with the information stored in the system. They hire many trained people to maintain these data centers, upgrade hardware and software regularly, and maximize the efficiency of the DMV. These employees are very well compensated because their skills are valuable. The DIT course sets you on a path to acquire these skills so you can work in these roles across government and industry.
However, the DMV's IT system (hardware, software, IT personnel, and processes) will be very different from that of the Department of Homeland Security (DHS) or any other organization, for that matter. Even within organizations, IT systems evolve constantly based on how we want to gather information, what bits of information we want to gather, and what information has lost meaning over time. Some IT systems need to be highly secure, some are spread across the world, and some may be small enough to need just one employee and a desktop.
![](/wp-content/uploads/2024/04/041924-03-History-Technology.jpg)
Until around 2015, there was a distinction between personal electronic devices, such as cell phones, and office IT systems, such as PCs. But now, with powerful smartphones and cheap mobile Internet, this distinction is disappearing. Employees expect business applications to be as user-friendly as consumer applications and consumer services to be as secure as business services. Employees can respond immediately, round the clock, from work or home. Workers in the gig economy (e.g., Uber drivers, DoorDash delivery people, Amazon and FedEx drivers) take this one step further and use personal phones as their work devices to process orders. In this case, their personal mobile devices are part of the company's IT system. See Figure 3. The COVID-19 pandemic and the work-from-home trend have further reduced the distinction between personal and professional IT.
How Information Technology Began
This section provides a quick tour through the history of how we have reached the current state of "information technology everywhere." We show how innovative teams have responded to human needs and commercial incentives to create the technologies we take for granted today. If you find this interesting, we hope you will read more about these individuals and technologies on the Internet.
![](/wp-content/uploads/2024/04/041924-04-History-Technology.jpg)
The history of information technology includes the history of computer hardware and software. Charles Babbage is credited with building the first mechanical computer in the 1820s. Over a hundred years later, in 1946, a team at the University of Pennsylvania publicly reported the first programmable, general-purpose computer, called the Electronic Numerical Integrator and Computer (ENIAC). The ENIAC weighed 30 tons and took up 1,800 square feet of space. See Figure 4. It supported most of the hardware and software concepts that modern programmers recognize: it could read input data, hold information in memory, step through programming instructions, create and call sub-programs, loop through code, and print output.2,3
The ENIAC didn't have many of the peripherals we take for granted now, such as monitors, keyboards, and printers. To use the ENIAC, programmers had to write instructions (code) onto punched paper cards (cards with holes that could be read by machines). It took weeks of code-writing and debugging before the computer could do anything useful. The US Army funded the development of the ENIAC to compute firing tables during WWII.4 Firing tables give gun operators the optimal settings to hit a target, taking into account terrain conditions, weapon wear and tear, ammunition type, and so on.5 While the ENIAC was created to serve a specific military purpose, its general computing capabilities captured the imagination of the public.
The ENIAC was a computer like any modern computer, but it did not use software as we understand it today. Every instruction for every task was hard coded by experts. If the task was to be repeated, the instructions were written again on punch cards, which could take days. A lot of these instructions involved tasks such as reading data and writing outputs, which are common to all computer programs.
Eventually, these shared tasks were aggregated into computer programs called operating systems. The Operating System (OS) is the brain of the computer. A mouse, keyboard, display monitor, motherboard, and storage drives are all components of a computer, but they act together as one computer only because the operating system recognizes them and orchestrates a symphony of coordinated actions to complete the task you tell the computer to perform. When you move your mouse, tap your screen, type on your keyboard, or make a phone call, it is the operating system that recognizes the action and tells the components how to act to bring about the desired outcome.
Most complex organizations offer a "front office" that makes it simple for users to request and receive services. At most schools, for example, the front office is where students report absences and pick up their schedules, teachers report grades and request supplies, and parents make inquiries. The front office staff are experts in handling these requests and orchestrate the actions necessary to complete them. They also ensure that administrative requirements, such as student privacy, are respected as these actions are performed. Operating systems perform the same role for computers: they receive inputs from users and applications and coordinate all necessary actions until the appropriate output is presented to users.
What Does an Operating System Do? – Operating systems are to computers what front offices are to schools.
![](/wp-content/uploads/2024/04/041924-05-History-Technology.jpg)
Operating systems evolved rapidly in the 1960s and 70s. In 1971, AT&T, the dominant phone company at the time, released an operating system called Unix. See Figure 5. Unix and its variants were freely distributed across the world in the 1970s and 1980s, and some of the world's best computer scientists and engineers volunteered their contributions to make Unix extremely capable. These experts were guided by the principle of the "minimum number of keystrokes [to] achieve the maximum effort."6 Because of their powerful capabilities and low or no cost, Unix and its variants, including the popular Linux operating system, power most computers around the world today, including all major smartphones. Windows is another popular operating system, used extensively on desktops and in data centers.
Ken Thompson used his spare time at AT&T Bell Labs to write nerdy computer games and developed software to support his game, Space Travel. Because the software supported only one user (Ken Thompson), it became known at the lab as Thompson's Un-multiplexed Information and Computing Service, or Unics, later respelled as Unix. Eventually Unix acquired capabilities for multiple users to share the resources of a central processor, so the name no longer reflects a single-user limitation. But names stick, and the Unix name remains. The first edition of Unix was just 4,200 lines of code, an indication of how powerful good computer code can be. Even in these earliest editions, Unix included games such as Blackjack, which contributed to its popularity.
The Origin of Unix – Unix began as one developer's attempt to support games on office machines.7
A powerful economic force also contributed to Unix's widespread adoption. AT&T funded the development of Unix, but 15 years earlier, in 1956, AT&T had reached an agreement with the federal government that gave it monopoly status on long-distance telephony. In exchange, AT&T agreed not to sell any product that was not directly related to long-distance telephony. So AT&T shared the source code of Unix with multiple organizations, and those organizations released their adaptations to the world. One of the most popular adaptations was developed at UC Berkeley and called the Berkeley Software Distribution (BSD). The BSD license allowed adopters to make their own modifications without releasing them back to the community, which was very useful to commercial vendors. Among the most popular current commercial releases tracing their lineage to BSD are the operating systems on all Apple products, including macOS on laptops and iOS on smartphones. The popularity of Unix is the result of technical excellence and economic incentives.
The Era of Personal Computers
![](/wp-content/uploads/2024/04/041924-06-History-Technology.jpg)
Until the early 1980s, computers were too expensive for personal use. As the cost of manufacturing computer components came down, IBM saw an opportunity to make small, self-contained personal computers (PCs) with their own Central Processing Units (CPUs). Since Unix was designed for large centralized machines and dumb terminals, there was a need for an operating system that could run on these personal devices. IBM partnered with Microsoft to create one for its personal computers, called the Disk Operating System (DOS). Although DOS wasn't easy to use (users still had to type commands at a prompt), the idea of owning a computer caught on, and the IBM PC started the PC revolution by becoming the world's first popular personal computer.
The IBM personal computer was invented in Florida.8 Boca Raton, to be precise. Since 1967, IBM had operated a unit in Boca Raton to develop, build, and sell inexpensive computers. A team at this unit built the inexpensive personal computer (PC) by creating an open design in which components from multiple vendors could interoperate. The IBM PC used processors from Intel, operating systems from Microsoft, and components from several other vendors. Competition among these vendors brought costs down, while the popularity of personal computers helped many of these suppliers become large companies themselves. As of 2022, Microsoft (valued at about $2 trillion) and Intel (about $130 billion) are each worth more than IBM itself (about $118 billion), and the three companies together employ over 600,000 people.
The IBM PC and Florida
Along with inexpensive hardware, the mid-1980s brought user-friendly software. In 1985, Microsoft launched Microsoft Windows, an easy-to-use operating system with a Graphical User Interface (GUI). Microsoft also released Excel for financial calculations in 1985 and Word for text editing in 1989. Doug Klunder (Microsoft's first college hire9) led the development of Excel, and Charles Simonyi (the world's first repeat space tourist10) led the development of Word. Both programs leveraged the special capabilities of graphical user interfaces and propelled personal computers to widespread adoption in businesses. Computers also became convenient home devices for everyday users to write letters, manage personal finances, communicate with friends and family, create music, and watch entertainment. Between 250 million and 350 million personal computers are sold each year to meet this demand.
The Era of Networked Computers and the Internet
Communication is a very fundamental human activity and information exchange has been one of the most popular uses of computers. Computer engineers recognized this need for information exchange early on and developed technologies for computers to talk to each other. The initial networks were limited in scope, connecting computers located within an office, and allowing users within an office to send emails to each other and share expensive resources such as printers.
One of the most famous demonstrations in the history of technology happened on December 9, 1968. Douglas Engelbart delivered a 90-minute demonstration of essentially all the personal networked computing technology we use today. The demo included networking, graphical user interfaces, web-like pages (hypertext), images, the computer mouse, video conferencing, and word processing. Within about 20 years, the technology was available in stores. For its far-reaching impact, the technology industry named it the "mother of all demos."
Mother of All Demos11
As networks grew, network effects emerged. To understand the network effect, imagine a village with just two telephones connected to each other by a wire. The telephones are not very useful, since they connect only two people in the village; conversations with everyone else happen outside the network. However, as more people in the village connect to the network, every telephone in the network becomes increasingly useful: the same telephone now reaches more people. This increase in benefit to existing members as more members join a network, at no extra cost to them, is called the network effect.
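The village intuition can be sketched with a little arithmetic. In a network where every pair of members can connect, the number of possible connections grows much faster than the number of members (this quadratic growth is often cited as Metcalfe's law). A small Python illustration:

```python
def possible_connections(n):
    """Number of distinct pairs in a network of n members.

    Each of the n members can talk to the other n - 1, and the link
    between A and B is the same link as between B and A, so divide by two.
    """
    return n * (n - 1) // 2

for members in (2, 10, 100):
    print(members, "phones ->", possible_connections(members), "connections")
```

With 2 phones there is 1 possible connection; with 100 phones there are 4,950. Each new member makes the network more valuable for everyone already connected.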
The network effect generated powerful incentives within the industry to network computers. By 1981, the core computer networking technology we use today was specified. Since that time, the development of computers is closely associated with the development of computer networks, Internet, and the World Wide Web.
The Internet and the World Wide Web
Since the beginning of the 21st century, computer networks have become perhaps more robust and globally available than water and electricity. Users and businesses have taken advantage of this networking capability to share information and do business with people around the world. The global network of computers that share information with each other is called the Internet. Any information on the Internet can be linked to any other information on the Internet, and these links form a web of information. The information shared on the Internet is therefore called the World Wide Web.12
Information traveling on the Internet is like cars traveling on a highway system. The Internet is a vast network connecting many smaller local networks, in the same way that the highway network connects all connected roads. On the road, you can start from any point and drive to any other point using the network of roads, as long as the roads are connected. The World Wide Web enables the same capability for information.
Until about 1870, the United States was roughly as economically advanced as the rest of the world. But by 1920, it had advanced decisively in comparison. Much of this is attributed to the construction of five networks: water, sewage, electricity, roads, and telephones. By the end of the 20th century, the information network was added to this list of networks contributing to American prosperity.
Networks and Development
The Internet is built by connecting two types of networks: small networks within buildings and large networks that connect these small networks. The small networks connecting workers inside an office building or a school are called Local Area Networks (LANs). The network at your school or home is an example of a LAN. LANs help the computers in an office share files, emails, printers, and the Internet connection with each other. The networks that connect these small networks to each other are called Wide Area Networks (WANs). WANs are typically large networks spread across a wide geographic area, such as a state or country, and are used to connect the LANs within corporate and satellite offices. WANs are typically operated by Internet providers such as Verizon, Frontier, and Spectrum, and users pay subscription fees to access them.13
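One concrete place the LAN/WAN split shows up is in IP addressing: computers on a LAN typically use addresses drawn from reserved private ranges, while addresses reachable across the wider Internet are public. A minimal sketch using Python's standard ipaddress module (the specific addresses below are made-up examples):

```python
import ipaddress

# The first two addresses come from ranges reserved for private LANs
# (192.168.0.0/16 and 10.0.0.0/8); the third is a public address
# routable across the wider Internet.
examples = ["192.168.1.10", "10.0.0.5", "8.8.8.8"]

for text in examples:
    addr = ipaddress.ip_address(text)
    scope = "private (LAN)" if addr.is_private else "public (Internet)"
    print(text, "->", scope)
```

The same private address can appear in millions of different LANs at once; it is the public addresses, managed by Internet providers, that must be unique across the whole network.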
The Mobile Revolution
![](/wp-content/uploads/2024/04/042024-01-History-Technology.jpg)
Computing technologies have evolved rapidly, and the technology industry has succeeded in shrinking computers to the size of mobile phones. Soon after the PC became a household device, developments in related technologies such as storage, battery capacity, screens, materials, and networking fueled the mobile revolution. As a result, the traditional phone has mostly been replaced by powerful computers called smartphones, which users around the world carry in their pockets and purses. While Windows was the dominant operating system of the personal computer era, two operating systems dominate the mobile era: Apple's iOS and Google's Android. Both trace their lineage to Unix. While the iPhone's iOS is a version of Unix, Android is based on Linux, a Unix-compatible operating system initially created in 1991 by Linus Torvalds when he was a student at the University of Helsinki in Finland.
The availability of computers gave us the ability to design more powerful computers. This virtuous cycle was the basis of Moore's law, named after Gordon Moore, one of the founders of the computer chip pioneer Intel. Gordon Moore observed that the number of transistors on a microchip doubled roughly every two years. The transistor is the core component of a computer chip: a tiny electronic device used to store and process information. Its size is measured in nanometers (a human hair is roughly 100,000 nanometers wide). To appreciate the progress we have made, consider that a cutting-edge microchip in 1970 had about 2,000 transistors, while the Apple M1 Ultra chip, released in 2022, has 114 billion.
Moore’s Law
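The arithmetic of Moore's law can be sketched in a few lines. Assuming a doubling every two years from roughly 2,000 transistors in 1970 (a deliberately simplified model, not an exact fit to any real chip), a short Python projection:

```python
def projected_transistors(base_count, base_year, target_year):
    """Project a transistor count, assuming a doubling every two years."""
    doublings = (target_year - base_year) // 2
    return base_count * 2 ** doublings

# Starting from ~2,000 transistors in 1970, the model reaches about
# 134 billion by 2022 -- the same order of magnitude as the roughly
# 114 billion transistors on recent high-end chips.
print(projected_transistors(2000, 1970, 2022))
```

Real chips do not follow the curve exactly, but the doubling model captures why transistor counts grew from thousands to billions in about fifty years.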
Today, even street vendors with modest incomes in developing countries own mobile phones, and people living in distant countries can be as accessible as your next-door neighbors, thanks to free audio and video calling apps. You can use your phone to do office work while being entertained, whether at home or waiting in line at a grocery store.
Women in Technology
![](/wp-content/uploads/2024/04/042024-02-History-Technology.jpg)
Did you know that the person often regarded as the world's first programmer was a woman, Ada Lovelace? Early on, women didn't get sufficient credit for their work. The dedicated team of women programmers who worked on the ENIAC received recognition for their work only in the 1980s.
Women were the original "computers," doing complex math problems by hand for the military before the machine that took their name replaced them. During the 1940s and 50s, women remained the dominant sex in programming, and in 1967 Cosmopolitan magazine published "The Computer Girls," an article encouraging women into programming. "It's just like planning a dinner," explained computing pioneer Grace Hopper. "You have to plan ahead and schedule everything so that it's ready when you need it. Programming requires patience and the ability to handle detail. Women are 'naturals' at computer programming."
Caroline Criado Perez, "Invisible Women", 2019
However, the moment it became clear that there was money to be made in computers, and that programmers needed to be brilliant, men ended up replacing women as programmers. Women faced significant hurdles in getting hired as programmers. Clearly, women had the programming skills, since they were already doing the job. However, we as a society have a brilliance bias14: we rarely see women as brilliant. Rather than trying to figure out an applicant's suitability for a job, tech companies stereotyped male characteristics as brilliant: a nerdy attitude, unkempt hair and face, staying up all night to program, loitering on programming websites that often also host content women may find offensive. Hiring managers failed to account for the fact that a woman programmer may look different and express her love of programming in a different way. For example, the tech-hiring platform Gild combed through applicants' social data to assess their suitability.
According to Gild's data, frequenting a particular Japanese manga site is a "solid predictor of strong coding." Programmers who visit this site therefore receive higher scores. Which all sounds very exciting, but as O'Neil points out, awarding marks for this rings immediate alarm bells for anyone who cares about diversity. Women, who as we have seen do 75% of the world's unpaid care work, may not have the spare leisure time to spend hours chatting about manga online. O'Neil also points out that "if, like most of techdom, that manga site is dominated by males and has a sexist tone, a good number of the women in the industry will probably avoid it."
Caroline Criado Perez, “Invisible Women”, 2019
But the bias against women doesn鈥檛 end at hiring.
More than 40% of women leave tech companies after ten years, compared to 17% of men. A report by the Center for Talent Innovation found that women didn't leave for family reasons or because they didn't enjoy the work. They left because of "workplace conditions," "undermining behavior from managers," and "a sense of feeling stalled in one's career." A feature in the Los Angeles Times similarly found that women left because they were repeatedly passed over for promotion and had their projects dismissed. Does this sound like a meritocracy? Or does it look more like institutionalized bias?
Caroline Criado Perez, “Invisible Women”, 2019
, from , published by , free and open access as part of project (2023).