
T171: You, Your Computer and the Net

Course overview

How did the PC and the Internet arrive at their present state? What are the implications of the computer revolution? How does the Internet affect business? This course answers these questions and helps students develop an understanding of the computer industry, the Internet and e-business. It also helps them use the computer for effective information searching and analysis. Students need to be familiar with the Windows environment and have some experience of navigating the web. The teaching and assessment for T171 are entirely online.

  

Module One

1- Hardware: the basic computer unit (microprocessor, memory and hard disk).

2- Peripherals: separate hardware added for input and output (monitor, printer and keyboard).

3- Software: programs that operate and make use of the computer (operating system, programming languages and applications).

4- Operating system (OS) and BIOS
Operating systems (DOS, Windows, UNIX, Mac OS) are essential to run computers. Their roles are to:
- manage computer resources.
- run applications.
- provide a user interface.
The BIOS (basic input/output system) is stored in ROM and manages the data flow between the operating system and the input/output devices.

5- Transistors: tiny two-state devices that are the basic elements of a computer's integrated circuits. A microprocessor chip consists of millions of transistors; computers work by manipulating 1s and 0s.

6- Benefits of being digital
- Digital devices are often faster and more efficient than their analogue equivalents.
- Digital devices are more reliable and less susceptible to noise.

7- Binary system
Used in computers to represent both data and instructions using 1s and 0s.
- Data: numbers, text, images.
- Instructions: commands to manipulate data.
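As a quick illustration of item 7, the short Python sketch below shows how a number and a single character can each be written as a pattern of 1s and 0s (the values are arbitrary and purely illustrative).

# Minimal sketch: representing data as binary patterns (illustrative values only).
number = 42
character = "A"

print(bin(number))                    # 0b101010 - the number 42 as 1s and 0s
print(format(ord(character), "08b"))  # 01000001 - the character's ASCII code as 8 bits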
8- PC components
- CPU (central processing unit): contains the ALU (arithmetic logic unit), the control unit and memory.
- Motherboard, or printed circuit board (PCB).
- Clock chip: synchronizes the computer's operations.
- Input/output devices: for data input and output.
- Memory (RAM and ROM).
- Permanent data storage (hard disk, CD-ROM, floppy disk and USB).
- Buses: connections used to send and receive data between the CPU and the other components.

9- Computer power is affected by
- The microprocessor chip (more transistors to perform more tasks).
- Clock speed.
- Word size.
- RAM (the computer's working memory).

10- Mainframes
- Large computers that carry out many different tasks.
- Very expensive.
- Software is customized to the needs of the customer.
- Used by large companies, universities and public institutions.
- Need specialists to operate them.
- Maintenance is a major task.
- Long-term relationship between customer and supplier.

11- PCs (personal computers)
- Intended for personal use.
- No need to be a specialist to operate one.
- Maintenance is not a big deal.
- Relatively cheap.
- Customers are a wide variety of people.
- Replaced frequently.
- Commonly available software.
- Short-term relationship between customer and supplier.

12- Analytical Engine
- Designed by Charles Babbage in 1822.
- A general-purpose device, capable of performing arithmetic and logical functions.
- Instructions were entered on punched cards.
- Based on mechanical movements.

13- Turing machine
- Described in a paper published by Alan Turing in 1936.
- A general-purpose device.
- Based on mechanical movements.
- Instructions were entered on a tape stream.
- It was the basis for the computers that were to follow.

14- ENIAC
- Electronic Numerical Integrator and Computer.
- Von Neumann joined the ENIAC project in 1944.
- The first large-scale computer to use vacuum tubes instead of electromechanical relays.
- The vacuum tubes performed the same functions that transistors do in a modern microprocessor.
- Capable of performing complex calculations.
- It was released in 1944.
- Limits: very little memory, and reprogramming it meant re-plugging some 6,000 switches.

15- EDVAC
- Electronic Discrete Variable Automatic Computer.
- Designed by ENIAC's team.
- It had a stored-program memory.
- Von Neumann published the EDVAC report, which described the structure now followed by modern computers.

16- IBM mainframes
- IBM launched the model 702 in 1953.
- They were expensive (around $1,000,000).
- Their development continued into the 1970s, with IBM coming to dominate the market.

17- Tasks of the microprocessor
The microprocessor is an integrated circuit that can be programmed to:
- Read from and write to memory (RAM).
- Manipulate information.
- Communicate with other parts of the computer.

18- Programming the microprocessor
- Machine code (the basic set of instructions, sequences of 1s and 0s).
- Assembly language (instructions that are translated into machine code).
- High-level languages (make the programmer's task easier; have more understandable instructions).

19- Moore's law
- The number of transistors that can be placed on the same area of a microprocessor doubles every 18 months.
- The price of computers stays roughly constant, but their power increases.
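To make item 19 concrete, the Python sketch below projects a transistor count over a few doubling periods, starting from the 2,300 transistors of the Intel 4004 mentioned in item 20 below (a rough illustration of the doubling rule, not a real forecast).

# Rough illustration of Moore's law: the transistor count doubles every 18 months.
transistors = 2300      # starting point: the Intel 4004 of 1971
year = 1971.0

while year < 1986:      # project about 15 years forward
    print(f"{year:.1f}: ~{transistors:,} transistors")
    transistors *= 2    # one doubling...
    year += 1.5         # ...every 18 months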
20- Microprocessor history, 1971
- In 1971 Ted Hoff, an Intel employee, came up with the idea and design of the microprocessor.
- The first microprocessor was called the 4004.
- It consisted of 2,300 transistors.
- Texas Instruments invented a microprocessor at the same time (1971) but never got it to market.
- Faggin was the person who led the production of the chip by developing new technology.

21- Microsoft mission
- Mission: "A computer on every desk and in every home, running Microsoft software".
- Motto: "We set the standard".

22- Bootstrapping
A small program in the ROM that allows the computer to start up.

23- Types of OS
- Text-based operating systems: CP/M, DOS.
- Graphical operating systems: Windows, XP and the Mac operating system.

24- Apple mission
Mission: "Make the computer accessible to everyone and make it very easy to use".

25- Characteristics of the new companies
- Very informal culture.
- Young, talented employees.
- Intellectual creativity.
- Incredible success.

26- Xerox PARC, 1970
In 1970 Xerox set up a research group at the Palo Alto Research Center (PARC). The team was composed of the most brilliant technologists. Its aim was to research the development of computing.

27- Good ideas in computing / Xerox
Nearly all the good ideas in computing can be traced back to Xerox:
- GUI (graphical user interface).
- Ethernet.
- Object-oriented programming.
- The laser printer.

28- WYSIWYG
Allows you to see a document on screen more or less as it will appear when printed. It needs a GUI.

29- GUI
In a GUI the cursor can be moved, text can be highlighted, several documents can be open at the same time, graphics can be drawn, and so on.

30- Bit-mapping technology
Bit-mapping technology treats the screen as thousands of pixels (picture elements). The first WYSIWYG screens were simply black and white; for colour WYSIWYG screens each pixel needs to be represented with more bits.
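Following on from item 30, the Python sketch below estimates how much memory a bit-mapped screen needs at different bit depths; the 1024 x 768 resolution is an assumed example, not a figure from the course.

# Rough estimate of bitmap memory: pixels x bits per pixel, converted to bytes.
width, height = 1024, 768            # assumed example resolution

for bits_per_pixel in (1, 8, 24):    # black and white, 256 colours, "true colour"
    total_bits = width * height * bits_per_pixel
    print(f"{bits_per_pixel:>2} bits/pixel -> {total_bits // 8:,} bytes")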
31- Alto
The machine that:
- First used a GUI.
- Was developed by the PARC team.
- Was never marketed.
- Was seen by a team from Apple in 1977, most notably Steve Jobs.

32- Benefits of networks
- Sharing resources (hardware and software).
- Communication (e-mail and chat).
- File access (storing and retrieving files / back-up).

33- Node
Each element on the network (PC, mainframe, printer, etc.) is referred to as a node.

34- Types of networks
- Star.
- Ring.
- Bus.
- LAN (local area network).
- WAN (wide area network).

35- Network software / protocols
When running a network you need network software whose role is to enforce the rules that coordinate the traffic on the network. These rules are called protocols.

36- Benefits of protocols
- Avoid collisions.
- Allow a smooth flow of traffic.
- Avoid blockages.
- Ensure that the data is intelligible to all the nodes on the network.
- Make sure that the data is not corrupted.

37- Ethernet
- A network protocol for LANs.
- Operates on a bus network.
- The most popular LAN protocol because it is cheap, fast and reliable.

38- Management in the IT industry
Two types of management:
*Vertical management
- Many levels separating bosses from workers.
- Inadequate for PARC's environment.
- Does not encourage creativity.
- Decisions take a lot of time.
*Horizontal management
- One manager.
- Practical for small organizations.
- Practical for the computing industry.
- Fewer levels of hierarchy.
- Anyone can become a manager.

39- IBM
International Business Machines, referred to as "Big Blue". It was widely regarded as having excellent:
- Quality of service.
- Products.
- Management.

40- IBU
The concept of a small development team within the larger structure, called an independent business unit (IBU).

41- Why do we need a standard?
To enable the various components and devices of an industry to work together, they need a set of rules, or standards, that they can all agree on.

42- Open architecture
Means that the components that made up the machine were bought from other suppliers and that their design was available for anyone else to use. This led to an open standard, which made the IBM PC the de facto standard for microcomputers.

43- Differences between IBM and Microsoft
- Number of customers.
- Sales and profits.
- Customer relations.
- Type of employees.

44- IBM clones
Computers that are identical to the IBM PC in every important respect are called clones:
- Same hardware architecture.
- Same BIOS.
- Same software.

45- Reverse engineering
Trying to build a function that does exactly the same thing as another function, without breaking any copyright laws that exist for that other function.

46- Why did IBM lose its market share?
- Using an open standard for the PC architecture: the key to the success of the PC and the problem for IBM.
- The decision to allow Microsoft to sell DOS.
- The surprising success of reverse engineering.
- The speed with which clones appeared.
- The delay in releasing an 80386-based computer.

47- Reasons for the success of a product
- Technological superiority: the product with the best technology has a very good chance of success.
- Legacy: the product needs to be compatible with older versions of the same product.
- Market leader: the market leader always has an advantage over its competitors.
- Social acceptance: sometimes good products never make it, because people either had no use for such a product or its price was too high.
- Product decisions: decisions made when the product is developed may affect its success.
- Marketing: in the PC industry, the marketing campaign alone can make or break a product.

48- Apple II
- Produced in 1977.
- The best seller of its time.

49- Apple III
- Had some initial flaws.
- Did not meet expectations.
- Its sales were very weak compared with the Apple II.

50- Lisa
- Released in 1983.
- Very slow and expensive.
- Motorola 68000 CPU.
- 1 MB of memory.
- Hard disk.
- GUI.
- A new metaphor: the desktop.

51- Desktop metaphor / Macintosh
In this metaphor the screen resembles the top of a desk, with various files on it and even a waste basket.

52- PageMaker
What the Macintosh lacked was a compelling application. A company named Aldus produced the desktop publishing software PageMaker, with which people were able to produce magazines and newspapers. The Macintosh was very suitable for PageMaker, which required a GUI.

53- Empowerment
Simply means giving power to someone; the personal computer was seen as the means by which power was transferred from large organizations to individuals.

54- How did Apple affect people's lives?
- Ease of use and access to information, for example using accounting software.
- The PC provides a tool to help people be more creative, for example using drawing software.
- Being able to run your own business anywhere in the world through the internet.

55- Reasons for Apple's decline
- The decision not to make its architecture an open industry standard.
- Apple had many leaders come and go after Steve Jobs left.
- The price of an Apple computer was always higher than that of an equivalent PC.

56- NOS (network operating system)
A network operating system allows the PC to interact with other computers. A NOS should perform the same functions as a regular OS and has some additional features:
- Multi-user: the OS needs to be able to allow several users to access the computer's resources.
- Multi-tasking: dividing up the CPU's time between the various tasks.
- Portable: able to work across a variety of computer architectures.
- Secure: a consequence of having multiple users and being on a network is that the system needs to be secure.
- Compatible: with other operating systems.
- Safe: not just from unwanted intruders, but also from "natural" disasters such as power failures.
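As a toy illustration of the multi-tasking feature in item 56 (dividing the CPU's time between several tasks), the Python sketch below hands "CPU time" to each task in turn, round-robin style; the task names and slice counts are invented for the example.

# Toy round-robin scheduler: the "CPU" is handed to each task in turn.
from collections import deque

tasks = deque([("print job", 2), ("e-mail", 3), ("backup", 1)])  # (name, slices needed)

while tasks:
    name, remaining = tasks.popleft()   # give the CPU to the next task in the circle
    print(f"running {name} for one time slice")
    remaining -= 1
    if remaining:                       # unfinished tasks rejoin the back of the queue
        tasks.append((name, remaining))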


Module Two

1- Host
Connects other computers to the internet and acts as a gateway between the LAN and the global network.

2- IP address
Each computer connected to the internet has a unique address, which enables any other computer to locate it and communicate with it. An IP address is composed of 4 numbers (4 bytes), e.g. 198.200.17.50.

3- DNS (Domain Name Server)
Translates a domain name (e.g. cnn.com) into an IP address.
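A quick illustration of items 2 and 3: the Python sketch below resolves a domain name to an IP address and shows the four numbers (bytes) that make it up. The host name is just the example from the notes, and the lookup needs a live network connection.

# Minimal sketch: resolve a domain name to an IP address (requires network access).
import socket

ip = socket.gethostbyname("cnn.com")            # DNS lookup: domain name -> dotted-quad IP
octets = [int(part) for part in ip.split(".")]

print(ip)       # e.g. "151.101.3.5" - the exact address will vary
print(octets)   # the same address as four numbers, one per byte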
4- URL (Uniform Resource Locator)
Example: http://www.cnn.com
- http:// - Hypertext Transfer Protocol.
- www - World Wide Web.
- cnn.com - domain name.
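The Python sketch below uses the standard urllib.parse module to split a URL into the parts described above; the URL is the example from the notes, with a path added for illustration.

# Minimal sketch: splitting a URL into its parts with the standard library.
from urllib.parse import urlparse

parts = urlparse("http://www.cnn.com/index.html")

print(parts.scheme)   # 'http'        - the protocol
print(parts.netloc)   # 'www.cnn.com' - the host / domain name
print(parts.path)     # '/index.html' - the resource on that host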
5- System
A system is an assembly of components connected together in an organized way. The components are affected by being in the system and may change if they leave it.

6- Systems thinking
Systems thinking is a way of looking at the world by concentrating on the whole rather than on the individual parts.

7- Benefits of systems thinking
- Focus on the properties of the whole rather than of the components.
- Pay greater attention to the ways the components interact.
- Take multiple partial views of things.
- Look at things from many different perspectives.

8- What is the "internet"?
It is a global network of computer networks.

9- Where did the internet come from?
- Prehistory: ideas which led to the internet, or which inspired the individuals who conceived and built the system.
- ARPA and the ARPANET: the Advanced Research Projects Agency was a special agency within the US Department of Defense, set up to fund and foster advanced research in a number of areas, including computing. In 1966 the agency decided to construct a wide area network that would link ARPA-funded research laboratories across the US; the ARPANET was designed and built between 1967 and 1972.
- The internet as we know it today evolved from the ARPANET. The drive was to find a way of linking different networks together into a "network of networks".
- The World Wide Web: invented at CERN in Geneva in 1989. The technical infrastructure for the web was in existence by 1991, but it was not until 1993, with the launch of the first big "browser" programs, that the web took off.
- Usenet news: global online conferences ("newsgroups") in which much of the discussion held on the internet takes place.
- The open source movement: a social movement consisting mainly of expert programmers who share a particular view regarding intellectual property and who have evolved a distinctive way of developing software.

10- Time sharing
The computer passes control from one program to another in a "circular" manner, so every user thinks the computer is serving him alone.

11- Decentralization
Means that nodes communicate with their neighbours with no central control.

12- Redundancy / distributed
Having more connections than are strictly needed for normal communication enables a decentralized network to keep functioning under faulty conditions.

13- Analogue communication
A physical connection has to be established between the communicating points ("circuit switching"); the switched connection must remain active until the communication is done.

14- Digital communication
No direct physical connection between the communicating points. The sender's message is converted into sequences of 0s and 1s, the sequence is broken into equal pieces called packets, the packets are transmitted to the receiver one after another, and they are collected at the receiving end to reassemble the original message. This is called packet switching.
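To illustrate item 14, the Python sketch below breaks a message into fixed-size packets and reassembles it at the other end; this is only a toy model of packet switching, and the packet size and message are arbitrary.

# Toy model of packet switching: split a message into packets, then reassemble it.
message = "Digital communication sends data as packets of 0s and 1s."
packet_size = 10

# Sender: break the message into equal-sized chunks (the last one may be shorter).
packets = [message[i:i + packet_size] for i in range(0, len(message), packet_size)]

# Receiver: collect the packets and rebuild the original message.
reassembled = "".join(packets)

print(packets)
print(reassembled == message)   # True - the message survives the trip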
15- IMPs
Interface Message Processors: small computers inserted between each host and the network of transmission lines; they act as the host's interface to the network. They are linked together to form a sub-network of IMPs.

16- Advantages of IMPs
*They free the host computer from the routing load, such as:
- dividing the message into packets.
- routing the packets.
- gathering the packets.
*The same routing programs can be used at each site, since the devices are identical.

17- NWG
The Network Working Group: the name given to a group of graduate students from two universities who sat together to discuss what applications should be built on the ARPANET.

18- RFC
Request for Comments: technical notes written by the NWG to exchange ideas on the design of the host software for the ARPANET.

19- Benefits of RFCs
- They promoted cooperative and open working methods.
- They are timely documents containing information that comprises the consensus of the network developers.
- RFCs present an accurate trace of how internet software evolved.

20- Why internet protocols?
- They provide the rules that govern how computers communicate.
- They give meaning to the bits flowing between the IMPs.
- They are concerned with passing messages.
- They specify the format a message must take, and the way in which computers must exchange messages within the context of a particular activity.

21- SMTP
Simple Mail Transfer Protocol: used to send and receive messages.

22- FTP
File Transfer Protocol: used to transfer files between computers.

23- Telnet
Used for logging into remote hosts.

24- HTTP
Hypertext Transfer Protocol: used to transmit information on the WWW.

25- NNTP
Network News Transfer Protocol: used to transmit network news.

26- NCP
Network Control Protocol: enabled different hosts on the network to communicate.

27- E-mail
One of the most important applications developed on the ARPANET. It was developed in 1970 as a method of machine-to-machine message exchange. The first e-mail was written by Ray Tomlinson (a hacker).
- In 1970 the ARPANET community formalized an e-mail protocol.
- In 1975 a revised e-mail protocol was introduced in RFC 680.
- The final revision of the e-mail protocol was made in 1977; since that date the protocol has remained unchanged.

28- Building blocks of the internet
- Gateways.
- TCP (Transmission Control Protocol).
- IP (Internet Protocol).

29- Other packet-switching networks
- The British NPL network.
- The Cyclades network in France.
- The ALOHA packet-radio network in Hawaii.
- SATNET, a satellite network.

30- How to connect incompatible networks
- Use computers known as gateways between the different networks.
- Make the hosts responsible for end-to-end transmission of packets, with error correction and retransmission if necessary.
- Devise the protocols necessary for performing the previous two tasks.

31- How the internet works
The internet works by breaking long messages into smaller chunks called packets.

32- Packets
A packet is a string of bits divided into different segments.

33- Payload
The core of the packet: the data segment.

34- Headers and trailers
Extra information added to the data by the different protocol layers (SMTP, TCP, IP and Ethernet).

35- Application layer
This is where the user interacts with the network.
- SMTP: Simple Mail Transfer Protocol, used to send and receive electronic mail.
- HTTP: Hypertext Transfer Protocol, allows you to get pages from the WWW.
- FTP: File Transfer Protocol, used to transfer files between computers.

36- Transport layer
This is where TCP resides; its job is to ensure the reliability and integrity of messages. TCP (Transmission Control Protocol):
- Deals with the assembly and disassembly of packets.
- Error detection, correction and retransmission if necessary.
- Flow control: regulating the transmission rate.
- Sequencing: ensuring that packets are sorted into the same order as they were transmitted.
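As a rough picture of the packet structure and protocol layers in items 32-36 (and the network and link layers that follow), the Python sketch below wraps a payload in nested "headers", one per layer. It is purely illustrative: the header contents are made up and bear no relation to real packet formats.

# Toy illustration of protocol layering: each layer wraps the data in its own header.
payload = "Hello, ARPANET!"                            # what the application wants to send

smtp_segment = f"[SMTP]{payload}"                      # application layer adds its header
tcp_segment  = f"[TCP seq=1]{smtp_segment}"            # transport layer adds sequencing info
ip_packet    = f"[IP dst=198.200.17.50]{tcp_segment}"  # network layer adds addressing
frame        = f"[ETH]{ip_packet}[/ETH]"               # link layer adds a header and trailer

print(frame)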
37- Network layer
Responsible for working out how to get packets to their destination. IP (Internet Protocol) handles the addressing of packets.

38- Link layer
Responsible for communicating with the hardware that connects your machine to the internet. PPP (Point-to-Point Protocol) governs the transmission of IP packets over serial lines.

39- Usenet news
A way of exchanging news and opinions: a program that enables people to post articles to a shared location, where they can be read by the many other people who have access to that location.
- Each location is called a newsgroup.
- Each newsgroup is restricted to a certain subject.
- Usenet emerged from the community of researchers and programmers who used the UNIX OS.

40- UNIX
A multi-user, time-sharing OS: it allows more than one user to work on the machine at the same time by sharing the CPU's time.
- Created at Bell Labs for AT&T.
- Written by Ken Thompson and Dennis Ritchie.

41- Properties of UNIX
Two main properties distinguish UNIX from other operating systems:
- The existence of a kernel within the UNIX OS.
- The final versions of UNIX were written in C, a high-level language that is easy to understand and use compared with assembly language, which was difficult to understand and machine-specific.

42- Kernel
An isolated, small piece of code that can easily be placed on another machine, which will then run UNIX.

43- UUCP
The UNIX-to-UNIX copy program: enables UNIX users to import new UNIX programs and releases, as well as exchange common discussions, over a phone line.

44- Fidonet
One of the first systems that mainly connected PC home users. It allowed them to receive and store information and, at the end of the day, send it to other nodes in the network over a phone line.

45- Open source movement
- Established by Richard Stallman.
- Its main idea is to have software and its source code available for programmers and developers to benefit from and improve in a cooperative manner.

46- Copyleft
Requires people who use such free software to make it available to others, even if they have made significant improvements to it.

47- Linux
- Developed by Linus Torvalds.
- Built on cooperative development, the copyleft principle and the efforts of Richard Stallman.
- Considered one of the best networking operating systems, and available free of charge.

48- Vannevar Bush
In 1945 described, in an article, an information storage and retrieval machine he called the Memex, which reads like an early description of the web.

49- Douglas Engelbart
Invented many of the fundamentals of modern PCs, such as the computer mouse and bit-mapped screens, and proposed a method for linking documents similar to the web. A hypertext document contains many links, so you can read some text and then jump to other text by following a link.

50- Ted Nelson
Invented the hypertext concept. He worked on the Xanadu project (which took more than 30 years) for constructing a global hypertext publishing system.

51- Bill Atkinson
Invented HyperCard, the first simple hypertext system for personal computers. Its cards used "hot spots" which, when clicked, would instantly cause a jump to another card. A collection of linked cards was called a "stack", and the key card in the stack was called the "Home card".

52- Hypertext system
A system that presents information in a simple format and maintains links that point to other pages on different machines, or to other pages on the same site.

53- Hypermedia system
A simple extension of hypertext in which text is not the only way to create links: images, sound, animations and video can also link to other pages.

54- Characteristics of a hypertext system
- Allows remote access.
- Allows access to the same information from different types of computer system.
- Is non-centralized.
- Allows users to add their own private links to and from public information.
- Allows access to existing data.
- Allows live links to be made between dynamically changing data.

55- Browser
A client program that communicates with a server and provides a window through which the user can see the set of linked hypertext documents or resources.

56- Types of browser
*Text-based browsers
For text-only pages; they do not handle images. They are still in use because they are:
- Smaller.
- Faster.
- Less demanding of RAM and other resources.
- Able to run on mobile phones.
*Graphical browsers
Can handle all types of information, including images. Examples: Mosaic, Opera, Netscape Navigator and Internet Explorer.

57- Protocols
To enable different machines to talk to one another, Tim Berners-Lee came up with a set of rules, or protocols:
- HTTP (Hypertext Transfer Protocol) specifies how an exchange of information between machines on the web should be handled. It defines how the four stages of a web transaction should be carried out: connection, request, reply, end.
- URL (Uniform Resource Locator) defines a uniform way of specifying where information is held.
- HTML (Hypertext Markup Language) provides a uniform way of structuring pages on the web. It consists of a set of conventions for attaching tags to text; a tag consists of "<", a directive and ">".
- The client-server model treats computers as two types: client computers, which request resources (such as files or web pages) from a server, and server computers, which respond to clients' requests according to common protocols.
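To make the client-server model and the four-stage web transaction of item 57 concrete, the Python sketch below opens a connection to a web server, sends a request, reads the reply and closes the connection. The host name is just an example, and running it needs a live network connection.

# Minimal sketch of a web transaction: connection, request, reply, end (needs network access).
import socket

host = "example.com"                                   # assumed example host

with socket.create_connection((host, 80)) as conn:     # 1. Connection
    request = f"GET / HTTP/1.0\r\nHost: {host}\r\n\r\n"
    conn.sendall(request.encode("ascii"))              # 2. Request
    reply = conn.recv(1024)                            # 3. Reply (first chunk only)
    # 4. End: the 'with' block closes the connection.

print(reply.decode("ascii", errors="replace").splitlines()[0])   # e.g. "HTTP/1.0 200 OK"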
58- Mosaic browser
- The first browser program to run on regular PCs.
- Had a GUI and was hence very easy to use.
- Designed to interpret images embedded in web pages.

59- Netscape Navigator
Another browser, written by Andreessen, who with his team started from scratch and wrote completely new code so as not to violate the copyright of the original Mosaic code. This led to a cleaner, more secure and more sophisticated package, which was given away free to ordinary users.

60- Internet Explorer
Microsoft bought the rights to Mosaic and distributed it with its Windows OS. This created a kind of war between the two companies and resulted in a major lawsuit against Microsoft.

61- Collaborative hypertext
The process by which people are allowed to add their own links to a hypertext document and help each other to produce such a document.

62- W3C
The World Wide Web Consortium: its objective is to lead the WWW to its full potential by developing common protocols that promote its evolution. Its goals include:
- Universal access: make the web available to all people, regardless of software, hardware, network, language and culture.
- Semantic web: enable computers to interpret and exchange information.
- Trust: make the web a truly collaborative medium.
- Interoperability: allow web products to communicate and work together by adopting common open protocols.
- Evolvability: keep the design simple, modular and compatible.
- Decentralization: limit the number of central web facilities.
- Cooler multimedia: develop a more user-friendly web.

63- Markup language
64- SGML
65- XML
66- XML vs. HTML
67- DTD
68- Metadata
69- XSL
70- Subject approach
71- Keyword approach
72- P2P networks


Module Three

1- Information richness
2- Information reach
3- Richness vs. reach concept
4- Electronic shopping test
5- Deconstruction
6- Impact of the internet on e-business
7- Ways the internet changes traditional business
8- Security limitations of the internet's initial design
9- Encryption
10- Encryption requirements
11- Types of encryption
12- Public key cryptography
13- Security protocols
14- Disintermediation
15- Reintermediation
16- Search engines
17- Web directories
18- Review or link sites
19- Portal sites
20- Intelligent agents
21- Ways of making money on the internet
22- E-commerce and e-business
23- EDI
24- Advantages of EDI
25- Disadvantages of EDI
26- ANX
27- Difference between extranet and intranet
28- Changes within the organization
29- Asynchronous communication
30- Needs for synchronous communication
31- New consumer models
32- Collaborative filtering
33- Online auction sites
34- Consumer community sites
35- Alternative communities

