L1 Introduction To Information Technology
TECHNOLOGY &
BUSINESS
Introduction to Information Technology
LESSON 1
Content
• Data are facts or figures. They are bits of information, but not information itself.
• When data are processed, interpreted, organized, structured or presented to make them
meaningful or useful, they are called information.
• For example, a list of dates is data. The data is meaningless without the context that makes the dates relevant, such as knowing that they are holiday dates.
• Data and information are intricately tied together, whether one is recognizing them as two
separate words or using them interchangeably, as is common today.
• Whether they are used interchangeably depends somewhat on the usage of data in its
context and grammar.
What is Data & Information?
• The number of visitors to a website by country is an example of data. Finding out that traffic from the U.S. is increasing while that from Australia is decreasing is meaningful information (see the sketch below).
• Often data is required to back up a claim or conclusion (information) derived or deduced from
it.
• For example, before a drug is approved by the FDA, the manufacturer must conduct clinical trials and present a large amount of data to demonstrate that the drug is safe.
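• To make the website-visitors example concrete, the following is a minimal Python sketch; the country names and visitor counts are illustrative, not real figures. It processes raw data (monthly visitor counts) into information (whether traffic is increasing or decreasing).

```python
# Raw data: monthly website visitors by country (figures are illustrative only).
visitors = {
    "US":        {"January": 12000, "February": 15000},
    "Australia": {"January":  8000, "February":  6500},
}

# Processing the data yields information: the direction of the trend per country.
for country, counts in visitors.items():
    change = counts["February"] - counts["January"]
    trend = "increasing" if change > 0 else "decreasing" if change < 0 else "flat"
    print(f"Traffic from {country} is {trend} ({change:+d} visitors)")
```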
Misleading Data
• When data leads to erroneous conclusions, the data are said to be misleading. This is often a result of incomplete data or a lack of context.
• For example, your investment in a mutual fund may be up by 5% and you may conclude that
the fund managers are doing a great job. However, this could be misleading if the major
stock market indices are up by 12%. In this case, the fund has underperformed the market
significantly.
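• As a quick worked check of the mutual-fund example above (only the 5% and 12% figures come from the example; the sketch itself is illustrative), the absolute return looks healthy while the return measured against the market reveals the underperformance.

```python
# Figures from the example above: fund up 5%, major market index up 12%.
fund_return = 0.05
index_return = 0.12

# Absolute view: the fund gained, which in isolation looks like good management.
print(f"Fund return: {fund_return:+.0%}")            # +5%

# Relative view: compared with the benchmark, the fund lagged the market.
relative = fund_return - index_return
print(f"Return vs. market index: {relative:+.0%}")   # -7%, a significant shortfall
```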
History of Computer Technology
• The history of computers goes back over 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges.
• The advancement of technology enabled ever more-complex computers by the early 20th
century, and computers became larger and more powerful.
• Today's computers are almost unrecognizable from the designs of the 19th century, or even from the huge 20th-century machines that occupied whole rooms, such as the Electronic Numerical Integrator and Calculator (ENIAC).
History of Computer Technology – 19th Century
• 1801: Joseph Marie Jacquard, a French merchant and inventor invents a loom that uses
punched wooden cards to automatically weave fabric designs. Early computers would use
similar punch cards.
• 1848: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron,
writes the world's first computer program. According to Anna Siffert, a professor of
theoretical mathematics at the University of Münster in Germany, Lovelace writes the first
program while translating a paper on Babbage's Analytical Engine from French into
English.
History of Computer Technology – 19th Century
• 1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first
printing calculator. The machine is significant for being the first to "compute tabular
differences and print the results," according to Uta C. Merzbach's book, "Georg Scheutz
and the First Printing Calculator" (Smithsonian Institution Press, 1977).
• 1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S.
Census. The machine saves the government several years of calculations and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).
History of Computer Technology – 19th Century
Famed mathematician Charles Babbage designed a Victorian-era computer called the Analytical Engine. This
is a portion of the mill with a printing mechanism. (Image credit: Getty / Science & Society Picture Library)
List Down 5 Key Milestones in History of Computer Technology
during 19th Century.
Activity 1
History of Computer Technology – Early 20th Century
• 1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and
builds the Differential Analyzer, the first large-scale automatic general-purpose
mechanical analog computer, according to Stanford University.
• 1936: Alan Turing, a British scientist and mathematician, presents the principle of a
universal machine, later called the Turing machine, in a paper called "On Computable Numbers". The central concept of the modern computer is based on his ideas. Turing is
later involved in the development of the Turing-Welchman Bombe, an electro-mechanical
device designed to decipher Nazi codes during World War II, according to the UK's
National Museum of Computing.
• 1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State
University, submits a grant proposal to build the first electric-only computer, without using
gears, cams, belts or shafts.
History of Computer Technology – Early 20th Century
• 1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto,
California. The pair decide the name of their new company by the toss of a coin, and
Hewlett-Packard's first headquarters are in Packard's garage.
• 1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's
earliest digital computer. The machine was destroyed during a bombing raid on Berlin
during World War II. Zuse fled the German capital after the defeat of Nazi Germany and
later released the world's first commercial digital computer, the Z4, in 1950.
• 1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic
computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first
time a computer can store information on its main memory, and can perform one
operation every 15 seconds.
History of Computer Technology – Early 20th Century
• 1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper
Eckert, design and build the Electronic Numerical Integrator and Calculator (ENIAC). The
machine is the first "automatic, general-purpose, electronic, decimal, digital computer".
• 1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding
from the Census Bureau to build the UNIVAC, the first commercial computer for business
and government applications.
• 1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the
transistor. They discover how to make an electric switch with solid materials and without
the need for a vacuum.
History of Computer Technology – Early 20th Century
• 1949: A team at the University of Cambridge develops the Electronic Delay Storage
Automatic Calculator (EDSAC), "the first practical stored-program computer".
• "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list
of prime numbers”.
• In November 1949, scientists with the Council for Scientific and Industrial Research
(CSIR), now called CSIRO, build Australia's first digital computer called the Council for
Scientific and Industrial Research Automatic Computer (CSIRAC).
The newly renovated garage where in 1939 Bill Hewlett and Dave Packard started their business, Hewlett
Packard, in Palo Alto, California. (Image credit: Getty / David Paul Morris)
History of Computer Technology – Early 20th Century
Computer operators program the ENIAC, the first automatic, general-purpose, electronic, decimal, digital
computer, by plugging and unplugging cables and adjusting switches (Image credit: Getty / Historical)
History of Computer Technology – Late 20th Century
• 1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for Common Business-Oriented Language, according to the National Museum of American History.
• Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation.
• In the same year, Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.
• 1954: John Backus and his team of programmers at IBM publish a paper describing their
newly created FORTRAN programming language, an acronym for Formula Translation,
according to MIT.
History of Computer Technology – Late 20th Century
• 1977: The first West Coast Computer Faire is held in San Francisco. Steve Jobs and Steve Wozniak present the Apple II computer at the Faire, which includes color graphics and features an audio cassette drive for storage.
• 1981: "Acorn," IBM's first personal computer, is released onto the market at a price point
of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Microsoft.
Optional features include a display, printer, two diskette drives, extra memory, a game
adapter and more.
History of Computer Technology – Late 20th Century
• 1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the
name of Steve Jobs' daughter, according to the National Museum of American History
(NMAH), is the first personal computer to feature a GUI. The machine also includes a
drop-down menu and icons. Also the same year, the Gavilan SC is released and is the
first portable computer with a flip-form design and the very first to be sold as a "laptop."
• 1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement.
The Macintosh is launched with a retail price of $2,500, according to the NMAH.
• 1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November
1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.
History of Computer Technology – Late 20th Century
• 1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear
Research (CERN), submits his proposal for what would become the World Wide Web. His
paper details his ideas for Hyper Text Markup Language (HTML), the building blocks of
the Web.
• 1993: The Pentium microprocessor advances the use of graphics and music on PCs.
• 1996: Sergey Brin and Larry Page develop the Google search engine at Stanford
University.
History of Computer Technology – Late 20th Century
• 1997: Microsoft invests $150 million in Apple, which at the time is struggling financially.
This investment ends an ongoing court case in which Apple accused Microsoft of copying
its operating system.
• 1999: Wi-Fi, the abbreviated term for "wireless fidelity" is developed, initially covering a
distance of up to 300 feet (91 meters).
History of Computer Technology – Late 20th Century
The first computer mouse was invented in 1963 by Douglas C. Engelbart and presented at the Fall Joint Computer
Conference in 1968 (Image credit: Getty / Apic)
History of Computer Technology – Late 20th Century
The Apple I computer, devised by Steve Wozniak, Steven Jobs and Ron Wayne, was a basic circuit board to which
enthusiasts would add display units and keyboards. (Image credit: Getty / Science & Society Picture Library)
List Down 5 Key Milestones in History of Computer Technology
during Late 20th Century.
Activity 2
History of Computer Technology – 21st Century
• 2001: Mac OS X, later renamed OS X then simply macOS, is released by Apple as the
successor to its standard Mac Operating System. OS X goes through 16 different
versions, each with "10" as its title, and the first nine iterations are nicknamed after big cats, with the first being codenamed "Cheetah".
• 2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to
customers.
• 2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challengers to Internet Explorer, owned by Microsoft. During its first five years, Firefox exceeded a billion downloads by users.
History of Computer Technology – 21st Century
• 2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-
based, dual-core mobile computer.
• 2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, minimize other windows by shaking one window, easy-to-access jumplists, easier previews of tiles and more.
History of Computer Technology – 21st Century
• 2011: Google releases the Chromebook, which runs on Google Chrome OS.
• 2015: Apple releases the Apple Watch. Microsoft releases Windows 10.
• 2016: The first reprogrammable quantum computer was created. "Until now, there hasn't
been any quantum-computing platform that had the capability to program new algorithms
into their system. They're usually each tailored to attack a particular algorithm," said study
lead author Shantanu Debnath, a quantum physicist and optical engineer at the University
of Maryland, College Park.
History of Computer Technology – 21st Century
• 2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new
"Molecular Informatics" program that uses molecules as computers.
• "Chemistry offers a rich set of properties that we may be able to harness for rapid,
scalable information storage and processing," Anne Fischer, program manager in
DARPA's Defense Sciences Office, said in a statement.
• "Millions of molecules exist, and each molecule has a unique three-dimensional atomic
structure as well as variables such as shape, size, or even color. This richness provides a
vast design space for exploring novel and multi-value ways to encode and process data
beyond the 0s and 1s of current logic-based, digital architectures."
Video
https://www.youtube.com/watch?v=OQax5NF5aEw
What is Information Technology?
• Information Technology (IT) refers to everything that businesses use computers for.
• By this definition, it could be said that Information Technology has been around some
65,000 years, the age estimation of the earliest known cave paintings where humans
were most likely recording what they saw in everyday life.
• From ancient cave drawings to alphabets and words, the complexity of what we could
create has only been limited by how long it takes us to produce improvements in
Information Technology.
• In 1945, the development of the stored-program concept allowed programs to be read into a computer.
• Its significance may not have been completely realized at the time, but the possibilities of
Information Technology had grown exponentially because of it.
• This development laid the foundation for the unprecedented achievements that took place
in IT for the next 50 years.
Origins of Information Technology
• 1973 - Bob Metcalfe invents Ethernet (using a medium such as coax as an ether to send
and receive data).
• 1993 - After developing the World Wide Web, CERN puts the software in the public domain, making it free of charge for anyone to use.
• The broader development of Information Technology is often divided into eras, including the Mechanical era (1450–1840) and the Electromechanical era (1840–1940).
• The work of most organizations would be slow without functioning IT systems. It is very
difficult to find a business that doesn’t at least partially rely on computers and the
networks that connect them.
• Maintaining a standard level of service, security and connectivity is a huge task, but it’s
not the only priority or potential challenge in IT.
• More and more companies want to implement more intuitive and sophisticated solutions.
IT can provide the edge that a company needs to outsmart, outpace and out-deliver its competitors.
Hardware & Software
• Hardware is any part, component or device related to computers and their networks that can be physically touched and manipulated. This includes hardware installed inside the computer, like the motherboard, central processing unit and hard drive.
• Hardware also includes components that can be connected to the outside of a computer, like a keyboard, mouse and printer.
Hardware & Software
• Software encompasses all the data, applications and programs stored electronically, like an operating system or a video-editing tool.
• Some IT workers may spend more of their time configuring hardware components, but those components are also governed by software.
Information Security
• Information Security refers to the processes and methodologies which are designed and
implemented to protect print, electronic, or any other form of confidential, private and
sensitive information or data from unauthorized access, use, misuse, disclosure,
destruction, modification, or disruption.
Technical Support
• It is a type of assistance that is given to computer users, as a result of needs or problems
that may arise with the software and with the hardware of their computers, networks, etc.
• The value of information is directly linked to how well it helps decision makers achieve the organization's goals; this makes it important to distinguish data from information and to understand the characteristics used to evaluate the value of data.
• Technology has become a major portion of everyday life. People in nearly every job or
role are required to have some knowledge of computers and software. Information
Technology (IT) is used by organizations for a variety of reasons.
• Email was one of the early drivers of the Internet, providing a simple and inexpensive
means to communicate.
• Over the years, several other communications tools have also evolved, allowing staff to
communicate using live chat systems, online meeting tools and video-conferencing
systems. Voice over Internet Protocol (VoIP) telephones and smartphones offer even more high-tech ways for employees to communicate.
Information Technology & Its Role in Business
• Inventory Management systems track the quantity of each item a company maintains,
triggering an order of additional stock when the quantities fall below a pre-determined
amount.
• These systems are best used when the inventory management system is connected to
the point-of-sale (POS) system. The POS system ensures that each time an item is sold,
one of that item is removed from the inventory count, creating a closed information loop
between all departments.
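• The sketch below shows one minimal, hypothetical way such a closed loop could work; the item name, quantities and reorder threshold are invented for illustration and are not taken from any particular POS or inventory product.

```python
# Hypothetical in-memory inventory with a pre-determined reorder threshold per item.
inventory = {"widget": {"on_hand": 12, "reorder_at": 10, "reorder_qty": 50}}

def place_order(item: str, qty: int) -> None:
    # In a real system this would notify purchasing or call a supplier API.
    print(f"Reorder triggered: {qty} x {item}")

def record_sale(item: str, qty: int = 1) -> None:
    """Called by the point-of-sale (POS) system each time an item is sold."""
    record = inventory[item]
    record["on_hand"] -= qty                       # POS removes sold units from stock
    if record["on_hand"] <= record["reorder_at"]:  # stock fell to or below the threshold
        place_order(item, record["reorder_qty"])   # trigger an order for additional stock

record_sale("widget", 3)   # on-hand count drops from 12 to 9, which triggers a reorder
```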
Information Technology & Its Role in Business
• Today, most companies store digital versions of documents on servers and storage
devices. These documents become instantly available to everyone in the company,
regardless of their geographical location.
• Companies can store and maintain a tremendous amount of historical data economically,
and employees benefit from immediate access to the documents they need.
Information Technology & Its Role in Business
• Management Information Systems (MIS) enable companies to track sales data, expenses
and productivity levels. The information can be used to track profitability over time,
maximize return on investment and identify areas of improvement.
• Managers can track sales daily, allowing them to immediately react to lower-than-
expected numbers by boosting employee productivity or reducing the cost of an item.
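• As a hedged illustration of the idea (the daily totals and the expected-sales target below are invented), a few lines of Python show how an MIS-style report might flag a lower-than-expected sales day.

```python
# Illustrative daily sales totals and an assumed daily sales target.
daily_sales = {"Mon": 4200.0, "Tue": 3100.0, "Wed": 4800.0}
expected_daily_sales = 4000.0

# Flag days that fall short so managers can react immediately.
for day, total in daily_sales.items():
    status = "on target" if total >= expected_daily_sales else "below target"
    print(f"{day}: {total:,.2f} ({status})")
```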
Information Technology & Its Role in Business
• If a customer calls a call center with an issue, the customer support representative will be
able to see what the customer has purchased, view shipping information, call up the
training manual for that item and effectively respond to the issue.
• The entire interaction is stored in the Customer Relationship Management (CRM) system, ready to be recalled if the customer calls again. The customer has a better, more focused experience, and the company benefits from improved productivity.
Information Technology & Its Role in Business
Manufacturing
• IT has been extensively used for processing customer orders, controlling inventory levels, developing production schedules and monitoring product quality. A whole new discipline, Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM), has evolved from the application of IT to design and manufacturing.
Project Management
• It can be defined as the discipline of applying specific processes and principles to initiate,
plan, execute and manage the way that new initiatives or changes are implemented within
an organization.
Information Technology & Its Role in Business
Data Analysis
• Investment firms heavily use information systems to analyze stocks, bonds and options to
provide better service to their clients.
Concepts of Information Technology
Technology
• Computer networks are systems of information-processing components built from a variety of hardware, software and telecommunications technologies.
Application
• Electronic business and commerce applications involve interconnected business information systems.
Concepts of Information Technology
Development
• Developing ways to use IT in business includes designing the basic components of information systems.
Management
• Managing IT emphasizes the quality, strategic business value and security of an organization's information systems.
IT Devices & Networks Create More IT Jobs
• With all this tech, lots of humans are needed to create, install, maintain, and protect it all.
• In the United States in 2020, there were 3.9 million postings for tech occupation job
openings. This trend is showing no signs of slowing down, with shortages of workers in all
six sectors of Information Technology.
Jobs Related to Information Technology
IT Support Technicians
• IT Support Technicians assist individuals having technical problems with hardware and software.
Networking Technicians
• Networking Technicians set up, administer, maintain and upgrade networks, allowing devices to interact with the network.
Programmers
• Programmers write and test the code that makes up software programs.
Web Developers
• Web Developers build websites and the infrastructure behind them.
Quiz
• Provide 3 examples of data and the information derived from it.