Thursday, February 18, 2010

Computer Ethics & Legal Issues

ASSIGNMENT 5

COMPUTER ETHICS & LEGAL ISSUES


1.1 UNETHICAL CODES OF CONDUCT

With the advancement of ICT, it is easy for anyone to retrieve your information from the Internet. You may not realize that when you fill in a form on the Internet, your information may be exposed and stolen. Examples of unethical computer codes of conduct include:

• modifying certain information on the Internet, affecting the accuracy of the information.
• selling information to other parties without the owner’s permission.
• using information without authorization.
• involvement in stealing software.
• invasion of privacy.

1.2 INTELLECTUAL PROPERTY

Intellectual property (IP) is a term referring to a number of distinct types of legal monopolies over creations of the mind, both artistic and commercial, and the corresponding fields of law. Under intellectual property law, owners are granted certain exclusive rights to a variety of intangible assets, such as musical, literary, and artistic works; discoveries and inventions; and words, phrases, symbols, and designs. Common types of intellectual property include copyrights, trademarks, patents, industrial design rights and trade secrets in some jurisdictions.
Although many of the legal principles governing intellectual property have evolved over centuries, it was not until the 19th century that the term intellectual property began to be used, and not until the late 20th century that it became commonplace in the United States.[2] The British Statute of Anne 1710 and the Statute of Monopolies 1623 are now seen as the origin of copyright and patent law respectively.[3]

1.3 PATENTS FOR INVENTIONS

Patents are granted by a governing body, for example, the Patent and Trademark Office of the U.S. government or European governments. The patent is not given to the first to invent the algorithm or function, but to the first to file for it. In the United States, a patent excludes anyone from making, using, or selling the idea covered by the patent. It guarantees legal remedies over a period of twenty years within the United States and its territories. The specific criteria for a patent vary country by country, but it is generally recognized that the invention must be novel, useful, and not obvious. One example where a patent is used rather than other legal options is that of an encryption algorithm. By acquiring a patent on an encryption algorithm, the inventor has full control over how it can be used or implemented in software. Another example is the formula for a prescription drug.

1.4 TRADEMARK FOR BRAND IDENTITY

A brand is a name used to identify and distinguish a specific product, service, or business. A legally protected brand name is called a proprietary name.

1.5 DESIGN FOR PRODUCT APPEARANCE

The appearance attributes of designed products noted in the literature often reflect what designers themselves perceive in a product design. This present research, however, provides knowledge on how consumers perceive product appearance by identifying appearance attributes that consumers use to distinguish the appearances of durable products. Descriptions of appearance were generated by consumers in a free categorization task. The descriptions were classified as the attributes Modernity, Simplicity and Playfulness. These attributes were confirmed in a separate rating-task performed by a second group of consumers. The attributes proved stable across different groups of consumers indicating that they are universal. Additionally, the attributes were validated across different product categories and are thus generalizable and not product category specific. The appearance attributes identified in this research provide knowledge of what consumers see in durable product appearance. Knowledge of what appearance attributes are perceived by consumers in a product design can help a designer to communicate certain pre-specified meanings in a product.

1.6 COPYRIGHT FOR MATERIAL

Copyright is the set of exclusive rights granted to the author or creator of an original work, including the right to copy, distribute and adapt the work. Copyright lasts for a certain time period after which the work is said to enter the public domain. Copyright applies to a wide range of works that are substantive and fixed in a medium. Some jurisdictions also recognize "moral rights" of the creator of a work, such as the right to be credited for the work. Copyright is described under the umbrella term intellectual property along with patents and trademarks.
The Statute of Anne 1709, long title "An Act for the Encouragement of Learning, by vesting the Copies of Printed Books in the Authors or purchasers of such Copies, during the Times therein mentioned", is now seen as the origin of copyright law.
Copyright has been internationally standardized, lasting between fifty and one hundred years from the author's death, or a shorter period for anonymous or corporate authorship. Some jurisdictions have required formalities to establish copyright, but most recognize copyright in any completed work, without formal registration. Generally, copyright is enforced as a civil matter, though some jurisdictions do apply criminal sanctions.

Tuesday, February 16, 2010

Quiz (The Impact Of ICT On The Society)

QUIZ THE IMPACT OF ICT ON THE SOCIETY

NAME: NUR FARZANA BINTI MOHD MAHDI
CLASS: 4 BESTARI
DATE: 09 FEBRUARY 2010

Activity 1
Choose either true or false.

1. B
2. A
3. A
4. B
5. B
6. A
7. B
8. A
9. A
10. B

Activity 2
Fill in the box with the right answer provided below.

Social Problems
~people are more interested in communicating through online chatting than in having real-time conversations.

Speed
~borderless communication.

Health Problems
~using computers frequently and for long hours is harmful to one’s health.

Sharing
~exchanging of information through the Internet: discussion groups, mailing lists and forums.

Paperless Environment
~information travels faster like sending e-mails to friends or business partners.

Globalisation
~information is stored systematically and can be retrieved at any time.

Friday, February 5, 2010

the impact of ict on the society

ASSIGNMENT 4

THE IMPACT OF ICT ON THE SOCIETY

Advantage

1.0 FASTER COMMUNICATION SPEED

1.1 In the past, it took a long time for any news or messages to be sent. Now, with the Internet, news and messages are sent via e-mail to friends, business partners or anyone else efficiently. With the bandwidth, broadband and connection speeds available on the Internet, information can travel fast, almost instantly. It saves time and is inexpensive.

2.0 LOWER COMMUNICATION COST

2.1 Using the Internet is more cost-effective than other modes of communication such as telephone, mail or courier services. It allows people to access large amounts of data at a very low cost. We do not have to pay for many of the basic services provided on the Internet. Furthermore, the cost of connecting to the Internet is relatively cheap.

3.0 RELIABLE MODE OF COMMUNICATION

3.1 Computers are reliable. With the Internet, information can be accessed and retrieved from anywhere and at any time. This makes it a reliable mode of communication. However, the input to the computer is contributed by humans. If the data passed to the computer is faulty, the result will be faulty as well. This is described by the term GIGO.
GIGO is short for Garbage In, Garbage Out. It means that the quality of the output depends on the quality of the input: bad input normally produces bad output.
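To make GIGO concrete, here is a minimal Python sketch (the sensor-averaging function and its readings are made up for illustration, not taken from the text): the program itself is correct, yet one garbage reading still produces a garbage result.

# Minimal illustration of GIGO (Garbage In, Garbage Out):
# the computation is correct, but a faulty input still yields a faulty result.

def average_temperature(readings):
    """Return the mean of a list of temperature readings in degrees Celsius."""
    return sum(readings) / len(readings)

good_input = [21.5, 22.0, 21.8]         # accurate sensor readings
bad_input = [21.5, 22.0, 2180.0]        # one mistyped reading: garbage in

print(average_temperature(good_input))  # about 21.8, a sensible result
print(average_temperature(bad_input))   # about 741.2, garbage out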

4.0 EFFECTIVE SHARING OF INFORMATION

4.1 With the advancement of ICT, information can be shared by people all around the world. People can share and exchange opinions, news and information through discussion groups, mailing lists and forums on the Internet. This enables knowledge sharing, which contributes to the development of a knowledge-based society.

5.0 PAPERLESS ENVIRONMENT

5.1 ICT has created the term paperless environment. This term means that information can be stored and retrieved through digital media instead of paper. Online communication via e-mail, online chat and instant messaging also helps in creating a paperless environment.

6.0 BORDERLESS COMMUNICATION

6.1 The Internet offers fast information retrieval, interactivity, accessibility and versatility. It has become a borderless source of services and information. Through the Internet, information and communication can be borderless.

Disadvantage

7.0 SOCIAL PROBLEMS

7.1 There are some negative effects of ICT. It has created social problems in society. Nowadays, people tend to choose online communication rather than real-time conversations. People tend to become more individualistic and introverted.
Other negative effects of ICT include:
• fraud
• identity theft
• pornography
• hacking
These can result in moral decadence and generate threats to society.

8.0 HEALTH PROBLEMS

8.1 Using a computer frequently and for long hours may harm the user. Computer users are also exposed to bad posture, eyestrain, and physical and mental stress. To address these health problems, an ergonomic environment can be introduced. For example, an ergonomic chair can reduce back strain and a screen filter can be used to minimize eye strain.

computerized & non-computerized system

ASSIGNMENT 3

COMPUTERIZED & NON-COMPUTERIZED SYSTEM

1.0 Banking system

1.1 Shadow banking institutions are typically intermediaries between investors and borrowers. For example, an institutional investor like a pension fund may be willing to lend money, while a corporation may be searching for funds to borrow. The shadow banking institution will channel funds from the investor(s) to the corporation, profiting either from fees or from the difference in interest rates between what it pays the investor(s) and what it receives from the borrower.
By definition, shadow institutions do not accept deposits like a depository bank and therefore are not subject to the same regulations. Familiar examples of shadow institutions included Bear Stearns and Lehman Brothers. Other complex legal entities comprising the system include hedge funds, SIVs, conduits, money funds, monolines, investment banks, and other non-bank financial institutions.

2.0 Manufacturing system

2.1 Manufacturing is the use of machines, tools and labor to make things for use or sale. The term may refer to a range of human activity, from handicraft to high tech, but is most commonly applied to industrial production, in which raw materials are transformed into finished goods on a large scale. Such finished goods may be used for manufacturing other, more complex products, such as household appliances or automobiles, or sold to wholesalers, who in turn sell them to retailers, who then sell them to end users - the "consumers".

Manufacturing takes place under all types of economic systems. In a free market economy, manufacturing is usually directed toward the mass production of products for sale to consumers at a profit. In a collectivist economy, manufacturing is more frequently directed by the state to supply a centrally planned economy. In free market economies, manufacturing occurs under some degree of government regulation.

Modern manufacturing includes all intermediate processes required for the production and integration of a product's components. Some industries, such as semiconductor and steel manufacturers use the term fabrication instead.

The manufacturing sector is closely connected with engineering and industrial design. Examples of major manufacturers in the United States include General Motors Corporation, Ford Motor Company, Chrysler, Boeing, Gates Corporation and Pfizer. Examples in Europe include Airbus, Daimler, BMW, Fiat, and Michelin Tyre.

2.1.1 The changing methods of manufacturing

• Craft or Guild system
• Putting-out system
• English system of manufacturing
• American system of manufacturing
• Soviet collectivism in manufacturing
• Mass production
• Just In Time manufacturing
• Lean manufacturing
• Flexible manufacturing
• Mass customization
• Agile manufacturing
• Rapid manufacturing
• Prefabrication
• Ownership
• Fabrication
• Publication

3.0 Commerce system

3.1 E-COMMERCE

3.1.1 E-commerce helps in boosting the economy. It makes buying and selling activities easier, more efficient and faster. For this application, computers, Internet and shared software are needed.

In the e-commerce sector, customers, suppliers and employees benefit from the usage of ICT.

3.1.2 DESCRIPTION OF COMMERCE
Commerce is an activity of exchanging, buying and selling of commodities on a large scale involving transportation from place to place.

3.2 COMMERCE BEFORE ICT
• Trading was done using the barter system, which later developed into currency.
• Advertisement was in the form of word of mouth, billboards and printed flyers.
• Trading globally was extremely slow and expensive. Traders had to find ways to market local products in the global market.

3.3 COMMERCE WITH ICT
E-commerce plays an important role in the economic scene. It includes distribution, buying, selling and servicing products that are done electronically.

evolution of computer

ASSIGNMENT 2

EVOLUTION OF COMPUTER

1.0 1st Generation Of Computer

1.1 History Of Computer Hardware
The history of computing hardware is the record of the constant drive to make computer hardware faster, cheaper, and able to store more data.
Before the development of the general-purpose computer, most calculations were done by humans. Tools to help humans calculate are generally called calculators. Calculators continue to develop, but computers add the critical element of conditional response, allowing automation of both numerical calculation and in general, automation of many symbol-manipulation tasks. Computer technology has undergone profound changes every decade since the 1940s.
Computing hardware has become a platform for uses other than computation, such as automation, communication, control, entertainment, and education. Each field in turn has imposed its own requirements on the hardware, which has evolved in response to those requirements.
Aside from written numerals, the first aids to computation were purely mechanical devices that required the operator to set up the initial values of an elementary arithmetic operation, then propel the device through manual manipulations to obtain the result. An example would be a slide rule where numbers are represented by points on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales. Numbers could be represented in a continuous "analog" form, where a length or other physical property was proportional to the number. Or, numbers could be represented in the form of digits, automatically manipulated by a mechanism. Although this approach required more complex mechanisms, it made for greater precision of results.
Both analog and digital mechanical techniques continued to be developed, producing many practical computing machines. Electrical methods rapidly improved the speed and precision of calculating machines, at first by providing motive power for mechanical calculating devices, and later directly as the medium for representation of numbers. Numbers could be represented by voltages or currents and manipulated by linear electronic amplifiers. Or, numbers could be represented as discrete binary or decimal digits, and electrically-controlled switches and combinatorial circuits could perform mathematical operations.
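As a rough sketch of that last idea, the following Python snippet (illustrative only, not a model of any historical machine) builds a half adder from boolean operations, showing how switching logic can add two binary digits.

# A half adder built from boolean logic: the kind of combinatorial circuit
# that lets electrically-controlled switches perform arithmetic on binary digits.

def half_adder(a, b):
    """Add two bits; return (sum_bit, carry_bit)."""
    sum_bit = a ^ b        # XOR gives the sum without the carry
    carry = a & b          # AND gives the carry out
    return sum_bit, carry

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")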
The invention of electronic amplifiers made calculating machines much faster than mechanical or electromechanical predecessors. Vacuum tube amplifiers gave way to discrete transistors, and then rapidly to monolithic integrated circuits. By defeating the tyranny of numbers, integrated circuits made high-speed and low-cost digital computers a widespread commodity.
This article covers major developments in the history of computing hardware, and attempts to put them in context. For a detailed timeline of events, see the computing timeline article. The history of computing article treats methods intended for pen and paper, with or without the aid of tables. Since all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.

1.2 Before computer hardware
The first use of the word "computer" was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century. From the end of the 19th century onwards though, the word began to take on its more familiar meaning, describing a machine that carries out computations.[1]

1.3 Punched card technology
In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards. The series of cards could be changed without changing the mechanical design of the loom. This was a landmark point in programmability.
In 1833, Charles Babbage moved on from developing his difference engine to developing a more complete design, the analytical engine, which would draw directly on Jacquard's punched cards for its programming.[19] In 1835, Babbage described his analytical engine. It was the plan of a general-purpose programmable computer, employing punch cards for input and a steam engine for power, using the positions of gears and shafts to represent numbers. His initial idea was to use punch-cards to control a machine that could calculate and print logarithmic tables with huge precision (a specific purpose machine). Babbage's idea soon developed into a general-purpose programmable computer, his analytical engine. While his design was sound and the plans were probably correct, or at least debuggable, the project was slowed by various problems. Babbage was a difficult man to work with and argued with anyone who didn't respect his ideas. All the parts for his machine had to be made by hand. Small errors in each item can sometimes sum up to large discrepancies in a machine with thousands of parts, which required these parts to be much better than the usual tolerances needed at the time. The project dissolved in disputes with the artisan who built parts and was ended with the depletion of government funding. Ada Lovelace, Lord Byron's daughter, translated and added notes to the "Sketch of the Analytical Engine" by Federico Luigi, Conte Menabrea.[20]

1.4 Desktop calculators
By the 1900s, earlier mechanical calculators, cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the representation for the state of a variable. The word "computer" was a job title assigned to people who used these calculators to perform mathematical calculations. By the 1920s Lewis Fry Richardson's interest in weather prediction led him to propose human computers and numerical analysis to model the weather; to this day, the most powerful computers on Earth are needed to adequately model its weather using the Navier-Stokes equations.[27]
Companies like Friden, Marchant Calculator and Monroe made desktop mechanical calculators from the 1930s that could add, subtract, multiply and divide. During the Manhattan project, future Nobel laureate Richard Feynman was the supervisor of the roomful of human computers, many of them female mathematicians, who understood the use of differential equations which were being solved for the war effort.
In 1948, the Curta was introduced. This was a small, portable, mechanical calculator that was about the size of a pepper grinder. Over time, during the 1950s and 1960s a variety of different brands of mechanical calculators appeared on the market. The first all-electronic desktop calculator was the British ANITA Mk.VII, which used a Nixie tube display and 177 subminiature thyratron tubes. In June 1963, Friden introduced the four-function EC-130. It had an all-transistor design, 13-digit capacity on a 5-inch (130 mm) CRT, and introduced Reverse Polish notation (RPN) to the calculator market at a price of $2200. The EC-132 model added square root and reciprocal functions. In 1965, Wang Laboratories produced the LOCI-2, a 10-digit transistorized desktop calculator that used a Nixie tube display and could compute logarithms.
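As a side note on how Reverse Polish notation works, here is a small Python sketch of a stack-based RPN evaluator; it is purely illustrative and does not model the EC-130 itself.

# Evaluating Reverse Polish notation (RPN) with a stack: operands are pushed,
# and each operator pops its two operands and pushes the result.

def eval_rpn(tokens):
    """Evaluate a list of RPN tokens, e.g. ['3', '4', '+', '2', '*'] -> 14.0."""
    stack = []
    ops = {
        '+': lambda a, b: a + b,
        '-': lambda a, b: a - b,
        '*': lambda a, b: a * b,
        '/': lambda a, b: a / b,
    }
    for token in tokens:
        if token in ops:
            b = stack.pop()          # right operand is on top of the stack
            a = stack.pop()
            stack.append(ops[token](a, b))
        else:
            stack.append(float(token))
    return stack.pop()

print(eval_rpn(['3', '4', '+', '2', '*']))   # (3 + 4) * 2 = 14.0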

1.5 Advanced analog computers
Before World War II, mechanical and electrical analog computers were considered the "state of the art", and many thought they were the future of computing. Analog computers take advantage of the strong similarities between the mathematics of small-scale properties—the position and motion of wheels or the voltage and current of electronic components—and the mathematics of other physical phenomena, for example, ballistic trajectories, inertia, resonance, energy transfer, momentum, and so forth. They model physical phenomena with electrical voltages and currents[28] as the analog quantities.
Centrally, these analog systems work by creating electrical analogs of other systems, allowing users to predict behavior of the systems of interest by observing the electrical analogs.[29] The most useful of the analogies was the way the small-scale behavior could be represented with integral and differential equations, and could be thus used to solve those equations. An ingenious example of such a machine, using water as the analog quantity, was the water integrator built in 1928; an electrical example is the Mallock machine built in 1941. A planimeter is a device which does integrals, using distance as the analog quantity. Unlike modern digital computers, analog computers are not very flexible, and need to be rewired manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited.
Some of the most widely deployed analog computers included devices for aiming weapons, such as the Norden bombsight[30] and the fire-control systems,[31] such as Arthur Pollen's Argo system for naval vessels. Some stayed in use for decades after WWII; the Mark I Fire Control Computer was deployed by the United States Navy on a variety of ships from destroyers to battleships. Other analog computers included the Heathkit EC-1, and the hydraulic MONIAC Computer which modeled econometric flows.[32]

1.6 Digital computation
The era of modern computing began with a flurry of development before and during World War II, as electronic circuit elements replaced mechanical equivalents, and digital calculations replaced analog calculations. Machines such as the Z3, the Atanasoff–Berry Computer, the Colossus computers, and the ENIAC were built by hand using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium. Defining a single point in the series as the "first computer" misses many subtleties (see the table "Defining characteristics of some early digital computers of the 1940s" below).
Alan Turing's 1936 paper[36] proved enormously influential in computing and computer science in two ways. Its main purpose was to prove that there were problems (namely the halting problem) that could not be solved by any sequential process. In doing so, Turing provided a definition of a universal computer which executes a program stored on tape. This construct came to be called a Turing machine.[37] Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
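To illustrate the idea of a machine driven by a stored transition table, here is a toy Turing machine simulator in Python. It is a minimal sketch of the concept, not Turing's original construction; the bit-inverting machine below is a made-up example.

# A toy one-tape Turing machine: read a symbol, look up the transition,
# write, move the head, and repeat until the machine halts.

def run_turing_machine(tape, transitions, state='start', blank='_'):
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    cells = list(tape)
    head = 0
    while state != 'halt':
        if head == len(cells):          # extend the tape with a blank on demand
            cells.append(blank)
        symbol = cells[head]
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == 'R' else -1   # this simple example only moves right
    return ''.join(cells).strip(blank)

# Transition table: (state, symbol read) -> (symbol to write, move, next state).
# This machine inverts a string of binary digits and halts at the first blank.
invert_bits = {
    ('start', '0'): ('1', 'R', 'start'),
    ('start', '1'): ('0', 'R', 'start'),
    ('start', '_'): ('_', 'R', 'halt'),
}

print(run_turing_machine('10110', invert_bits))   # prints 01001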

1.7 ENIAC
The US-built ENIAC (Electronic Numerical Integrator and Computer) was the first electronic general-purpose computer. It combined, for the first time, the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. (Colossus couldn't add). It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, and contained over 18,000 valves. One of the major engineering feats was to minimize valve burnout, which was a common problem at that time. The machine was in almost constant use for the next ten years.
ENIAC was unambiguously a Turing-complete device. It could compute any problem (that would fit in memory). A "program" on the ENIAC, however, was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that evolved from it. Once a program was written, it had to be mechanically set into the machine. Six women did most of the programming of ENIAC. (Improvements completed in 1948 made it possible to execute stored programs set in function table memory, which made programming less a "one-off" effort, and more systematic).

1.8 First-generation machines
Even before the ENIAC was finished, Eckert and Mauchly recognized its limitations and started the design of a stored-program computer, EDVAC. John von Neumann was credited with a widely circulated report describing the EDVAC design in which both the programs and working data were stored in a single, unified store. This basic design, denoted the von Neumann architecture, would serve as the foundation for the worldwide development of ENIAC's successors.[51] In this generation of equipment, temporary or working storage was provided by acoustic delay lines, which used the propagation time of sound through a medium such as liquid mercury (or through a wire) to briefly store data. A series of acoustic pulses is sent along a tube; after a time, as the pulse reached the end of the tube, the circuitry detected whether the pulse represented a 1 or 0 and caused the oscillator to re-send the pulse. Others used Williams tubes, which use the ability of a small cathode-ray tube (CRT) to store and retrieve data as charged areas on the phosphor screen. By 1954, magnetic core memory[52] was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s.
The first universal programmable computer in the Soviet Union was created by a team of scientists under direction of Sergei Alekseyevich Lebedev from Kiev Institute of Electrotechnology, Soviet Union (now Ukraine). The computer MESM (МЭСМ, Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.[55]

1.9 Commercial computers
The first commercial computer was the Ferranti Mark 1, which was delivered to the University of Manchester in February 1951. It was based on the Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the primary storage (using random access Williams tubes), secondary storage (using a magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves).[56] A second machine was purchased by the University of Toronto, before the design was revised into the Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[57]
In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951 [58] and ran the world's first regular routine office computer job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored program computer.[59]
In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than $1 million each ($8.2 million as of 2010).[60] UNIVAC was the first "mass produced" computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words). A key feature of the UNIVAC system was a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic media are still used in many computers.[61] In 1952, IBM publicly announced the IBM 701 Electronic Data Processing Machine, the first in its successful 700/7000 series and its first IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines. The first implemented high-level general purpose programming language, Fortran, was also being developed at IBM for the 704 during 1955 and 1956 and released in early 1957. (Konrad Zuse's 1945 design of the high-level language Plankalkül was not implemented at that time.) A volunteer user group, which exists to this day, was founded in 1955 to share their software and experiences with the IBM 701.
In 1955, Maurice Wilkes invented microprogramming,[64] which allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode).[65] It was widely used in the CPUs and floating-point units of mainframe and other computers, such as the IBM 360 series.[66]

1.10 Univac history and structure
Eckert and Mauchly built the ENIAC (Electronic Numerical Integrator and Computer) at the University of Pennsylvania's Moore School of Electrical Engineering between 1943 and 1946. A 1946 patent rights dispute with the university led Eckert and Mauchly to depart the Moore School to form the Electronic Control Company, later renamed Eckert-Mauchly Computer Corporation (EMCC), based in Philadelphia, Pennsylvania. That company first built a computer called BINAC (BINary Automatic Computer) for Northrop Aviation (which was little used, or perhaps not at all). Afterwards began the development of UNIVAC. UNIVAC was first intended for the Bureau of the Census, which paid for much of the development, and then was put in production.
With the death of EMCC's chairman and chief financial backer Harry L. Straus in a plane crash on October 25, 1949, EMCC was sold to typewriter maker Remington Rand on February 15, 1950. Eckert and Mauchly now reported to Leslie Groves, the retired army general who had managed the Manhattan Project. Remington Rand had its own calculating machine lab in Norwalk, Connecticut, and later bought Engineering Research Associates (ERA) in St. Paul, Minnesota. In 1953 or 1954 Remington Rand merged their Norwalk tabulating machine division, the ERA "scientific" computer division, and the UNIVAC "business" computer division into a single division under the UNIVAC name. This severely annoyed those who had been with ERA and with the Norwalk laboratory.
The most famous UNIVAC product was the UNIVAC I mainframe computer of 1951, which became known for predicting the outcome of the U.S. presidential election the following year. This incident is particularly infamous because the computer predicted an Eisenhower landslide when traditional pollsters all called it for Adlai Stevenson. The numbers were so skewed that CBS's news boss in New York, Mickelson, decided the computer was in error and refused to allow the prediction to be read. Instead they showed some staged theatrics that suggested the computer was not responsive, and announced it was predicting 8-7 odds for an Eisenhower win (the actual prediction was 100-1). When the predictions proved true and Eisenhower won a landslide within 1% of the initial prediction, Charles Collingwood, the on-air announcer, embarrassingly announced that they had covered up the earlier prediction.[1]
In 1955 Remington Rand merged with Sperry Corporation to become Sperry Rand. The UNIVAC division of Remington Rand was renamed the Univac division of Sperry Rand. General Douglas MacArthur was chosen to head the company. In the 1960s, UNIVAC was one of the eight major American computer companies in an industry then referred to as "Snow White and the seven dwarfs"—IBM, the largest, being Snow White and the others being the dwarfs: Burroughs, NCR, CDC, GE, RCA and Honeywell. In the 1970s, after GE sold its computer business to Honeywell and RCA sold its to Univac, the analogy to the seven dwarfs of legend became less apt and the remaining small firms became known as the "BUNCH" (Burroughs, Univac, NCR, Control Data, and Honeywell).
Around 1975, to assist "corporate identity" the name was changed to Sperry Univac, along with Sperry Remington, Sperry New Holland, etc. In 1978 Sperry Rand, an old fashioned conglomerate of disharmonious divisions (computers, typewriters, office furniture, hay balers, manure spreaders, gyroscopes, avionics, radar, electric razors), decided to concentrate on its computing interests and unrelated divisions were sold. The company dropped the Rand from its title and reverted back to Sperry Corporation. In 1986, Sperry Corporation merged with Burroughs Corporation to become Unisys.
Since the 1986 marriage of Burroughs and Sperry, Unisys has metamorphosed from a computer manufacturer to a computer services and outsourcing firm, competing in the same marketplace as IBM, Electronic Data Systems (EDS), and Computer Sciences Corporation. Unisys continues to design and manufacture enterprise class computers with the ClearPath and ES7000 server lines.

2.0 2nd Generation Of Computer

2.1 Transistors
The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs,[68] giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors, which although less reliable than the vacuum tubes they replaced had the advantage of consuming far less power.[69] The first transistorised computer was built at the University of Manchester and was operational by 1953;[70] a second version was completed there in April 1955. The later machine used 200 transistors and 1,300 solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic drum memory, whereas the Harwell CADET operated without any valves by using a lower clock frequency, of 58 kHz when it became operational in February 1955.[71] Problems with the reliability of early batches of point contact and alloyed junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.[72]
Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards such as the IBM Standard Modular System[73] each carrying one to four logic gates or flip-flops.
A second generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than one hundred thousand 1401s between 1960 and 1964.
Transistorized electronics improved not only the CPU (Central Processing Unit), but also the peripheral devices. The IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. The second generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack can be easily exchanged with another stack in a few seconds. Even though the removable disks' capacity is smaller than that of fixed disks, their interchangeability guarantees a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

3.0 3rd Generation Of Computer

3.1 Beyond
The explosion in the use of computers began with "third-generation" computers, making use of Jack St. Clair Kilby's[75] and Robert Noyce's[76] independent invention of the integrated circuit (or microchip), which later led to the invention of the microprocessor,[77] by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.[78] The Intel 8742, for example, is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.
During the 1960s there was considerable overlap between second and third generation technologies.[79] IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494. The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automatons were also implemented in minicomputers and microprocessors later, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business and universities.[80] It became possible to simulate analog circuits with the simulation program with integrated circuit emphasis, or SPICE (1971) on minicomputers, one of the programs for electronic design automation (EDA). The microprocessor led to the development of the microcomputer, small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond. Steve Wozniak, co-founder of Apple Computer, is sometimes erroneously credited with developing the first mass-market home computers. However, his first computer, the Apple I, came out some time after the MOS Technology KIM-1 and Altair 8800, and the first Apple computer with graphic and sound capabilities came out well after the Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments.
Systems as complicated as computers require very high reliability. ENIAC remained on, in continuous operation from 1947 to 1955, for eight years before being shut down. Although a vacuum tube might fail, it would be replaced without bringing down the system. By the simple strategy of never shutting down ENIAC, the failures were dramatically reduced. Hot-pluggable hard disks, like the hot-pluggable vacuum tubes of yesteryear, continue the tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance is made even more stringent when server farms are the delivery platform.[81] Google has managed this by using fault-tolerant software to recover from hardware failures, and is even working on the concept of replacing entire server farms on-the-fly, during a service event.[82]
In the twenty-first century, multi-core CPUs became commercially available.[83] Content-addressable memory (CAM)[84] has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. When the CMOS field effect transistor-based logic gates supplanted bipolar transistors, computer power consumption could decrease dramatically (A CMOS field-effect transistor only draws significant current during the 'transition' between logic states, unlike the substantially higher (and continuous) bias current draw of a BJT). This has allowed computing to become a commodity which is now ubiquitous, embedded in many forms, from greeting cards and telephones to satellites. Computing hardware and its software have even become a metaphor for the operation of the universe.[85] Although DNA-based computing and quantum qubit computing are years or decades in the future, the infrastructure is being laid today, for example, with DNA origami on photolithography.[86]
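For context, a software associative array behaves like a small content-addressable memory: a value is retrieved by its content (a key) rather than by a numeric address. A minimal Python sketch follows; the routing-table entries are made up purely for illustration.

# Software analogue of content-addressable memory: a dictionary looked up
# by content (the key) rather than by position or numeric address.

routes = {
    "192.168.1.0/24": "eth0",
    "10.0.0.0/8": "eth1",
}

# Look up the outgoing interface by the prefix itself, not by an index.
print(routes.get("10.0.0.0/8", "no match"))   # prints eth1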
An indication of the rapidity of development of this field can be inferred by the history of the seminal article.[87] By the time that anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC, and immediately started implementing their own systems. To this day, the pace of development has continued, worldwide.[88][89]

4.0 4th Generation Of Computer

4.1 Steve Jobs
Steven Paul "Steve" Jobs (born February 24, 1955) is an American businessman, and the co-founder and chief executive officer of Apple Inc. Jobs previously served as CEO of Pixar Animation Studios and is now a member of the Walt Disney Company's Board of Directors.
In the late 1970s, Jobs, with Apple co-founder Steve Wozniak, Mike Markkula[11] and others, designed, developed, and marketed some of the first commercially successful lines of personal computers, the Apple II series and later, the Macintosh. In the early 1980s, Jobs was among the first to see the commercial potential of the mouse-driven graphical user interface.[12][13] After losing a power struggle with the board of directors in 1985[14][15], Jobs resigned from Apple and founded NeXT, a computer platform development company specializing in the higher education and business markets. NeXT's subsequent 1997 buyout by Apple Computer Inc. brought Jobs back to the company he co-founded, and he has served as its CEO since then.
In 1986, he acquired the computer graphics division of Lucasfilm Ltd which was spun off as Pixar Animation Studios.[16] He remained CEO and majority shareholder until its acquisition by the Walt Disney Company in 2006.[2] Jobs is currently a member of Walt Disney Company's Board of Directors.[17][18]
Jobs' history in business has contributed much to the symbolic image of the idiosyncratic, individualistic Silicon Valley entrepreneur, emphasizing the importance of design and understanding the crucial role aesthetics play in public appeal. His work driving forward the development of products that are both functional and elegant has earned him a devoted following.[19]
In mid-January 2009, Jobs took a five-month leave of absence from Apple to undergo a liver transplant.[20]

4.2 Beginnings of Apple Computer
In 1976, Steve Jobs, Stephen Wozniak and Ronald Wayne,[37] with later funding from A.C. "Mike" Markkula Jr.,[11] a then semi-retired Intel product-marketing manager and engineer, founded Apple. Prior to co-founding Apple, Wozniak was an electronics hacker. Jobs and Wozniak had been friends for several years, having met in 1971, when their mutual friend, Bill Fernandez, introduced 21-year-old Wozniak to 16-year-old Jobs. Steve Jobs managed to interest Wozniak in assembling a computer and selling it. As Apple continued to expand, the company began looking for an experienced executive to help manage its expansion. In 1983, Steve Jobs lured John Sculley away from Pepsi-Cola to serve as Apple's CEO, asking, "Do you want to spend the rest of your life selling sugared water to children, or do you want a chance to change the world?"[38][39] The following year, Apple set out to do just that, starting with a Super Bowl television commercial titled, "1984." At Apple's annual shareholders meeting on January 24, 1984, an emotional Jobs introduced the Macintosh to a wildly enthusiastic audience; Andy Hertzfeld described the scene as "pandemonium."[40] The Macintosh became the first commercially successful small computer with a graphical user interface. The development of the Mac was started by Jef Raskin, and eventually taken over by Jobs.
While Jobs was a persuasive and charismatic director for Apple, some of his employees from that time had described him as an erratic and temperamental manager. An industry-wide sales slump towards the end of 1984 caused a deterioration in Jobs's working relationship with Sculley, and at the end of May 1985 – following an internal power struggle and an announcement of significant layoffs – Sculley relieved Jobs of his duties as head of the Macintosh division.[41]

5.0 5th Generation Of Computer

5.1 Integrated circuit
Microchips (EPROM memory) are sometimes packaged with a transparent window that shows the integrated circuit inside; fine silver-colored wires connect the integrated circuit to the pins of the package. The window allows the memory contents of the chip to be erased by exposure to strong ultraviolet light in an eraser device.
In electronics, an integrated circuit (also known as IC, microcircuit, microchip, silicon chip, or chip) is a miniaturized electronic circuit (consisting mainly of semiconductor devices, as well as passive components) that has been manufactured in the surface of a thin substrate of semiconductor material. Integrated circuits are used in almost all electronic equipment in use today and have revolutionized the world of electronics.
A hybrid integrated circuit is a miniaturized electronic circuit constructed of individual semiconductor devices, as well as passive components, bonded to a substrate or circuit board.

5.1.1 Introduction
Integrated circuits were made possible by experimental discoveries which showed that semiconductor devices could perform the functions of vacuum tubes, and by mid-20th-century technology advancements in semiconductor device fabrication. The integration of large numbers of tiny transistors into a small chip was an enormous improvement over the manual assembly of circuits using electronic components. The integrated circuit's mass production capability, reliability, and building-block approach to circuit design ensured the rapid adoption of standardized ICs in place of designs using discrete transistors.
There are two main advantages of ICs over discrete circuits: cost and performance. Cost is low because the chips, with all their components, are printed as a unit by photolithography and not constructed one transistor at a time. Furthermore, much less material is used to construct a circuit as a packaged IC die than as a discrete circuit. Performance is high since the components switch quickly and consume little power (compared to their discrete counterparts) because the components are small and close together. As of 2006, chip areas range from a few square millimeters to around 350 mm2, with up to 1 million transistors per mm2.

5.2 Mainframe
Mainframe may refer to one of the following:
• Mainframe computer, large data processing systems
• Mainframe Entertainment, a Canadian computer animation and design company.
• Mainframe is the city that the CGI cartoon ReBoot takes place in.
• Mainframe (band), a 1980s Electropop band
• Mainframe is the name of two fictional characters from Marvel Comics
o Mainframe (comics) appears in the series A-Next
o Mainframe appears in the Guardians of the Galaxy series
• Mainframe (Transformers) is an Autobot character in the Transformers series.
• A character in the G.I. Joe universe.
• A character from the game Gunman Chronicles.

evolution of communication

ASSIGNMENT 1

EVOLUTION OF COMMUNICATION

1.0 Speech

1.1 Evolution of the brain differentiated humans from animals, as among other things it allowed humans to master a very efficient form of communication - speech. A mutation of the FOXP2 gene, which occurred in homo sapiens about 200,000 years ago, was likely responsible for much of this change.
Speech greatly facilitated the transmission of information and knowledge to further generations. Experiences passed on through speech became increasingly rich, and allowed humans to adapt themselves to new environments - or adapt the environments to themselves - much more quickly than was possible before; in effect, biological human evolution was overtaken by technological progress and sociocultural evolution. Speech meant easier coordination and cooperation, technological progress and development of complex, abstract concepts such as religion or science. Speech placed humans at the top of the food chain, and facilitated human colonization of the entire planet.
Speech, however, is not perfect. The human voice carries only so far, and sign language is also rather limited in terms of distance. Further, all such forms of communications relied on human memory, another imperfect tool: memory can become corrupted or lost over time, and there is a limit to how much one can remember. With the accidental death of a 'wise man' or tribal elder, a pre-literate tribe could lose many generations of knowledge.

The imperfection of speech, which nonetheless allowed easier dissemination of ideas and stimulated inventions, eventually resulted in the creation of new forms of communications, improving both the range at which people could communicate and the longevity of the information. All of those inventions were based on the key concept of the symbol: a conventional representation of a concept.

1.2 The next step in the history of communications is petroglyphs, carvings into a rock surface. It took about 20,000 years for homo sapiens to move from the first cave paintings to the first petroglyphs, which are dated to around 10,000 BC.
It is possible that the humans of that time used some other forms of communication, often for mnemonic purposes - specially arranged stones, symbols carved in wood or earth, quipu-like ropes, tattoos, but little other than the most durable carved stones has survived to modern times and we can only speculate about their existence based on our observation of still existing 'hunter-gatherer' cultures such as those of Africa or Oceania.[4]

1.3 A pictogram (pictograph) is a symbol representing a concept, object, activity, place or event by illustration. Pictography is a form of proto-writing whereby ideas are transmitted through drawing. Pictographs were the next step in the evolution of communication: the most important difference between petroglyphs and pictograms is that petroglyphs are simply showing an event, but pictograms are telling a story about the event, thus they can for example be ordered in chronological order.
Pictograms were used by various ancient cultures all over the world since around 9000 BC, when tokens marked with simple pictures began to be used to label basic farm produce, and became increasingly popular around 6000-5000 BC.
Pictograms, in turn, evolved into ideograms, graphical symbols that represent an idea. Their ancestors, the pictograms, could represent only something resembling their form: therefore a pictogram of a circle could represent a sun, but not concepts like 'heat', 'light', 'day' or 'Great God of the Sun'. Ideograms, on the other hand, could convey more abstract concepts, so that for example an ideogram of two sticks can mean not only 'legs' but also a verb 'to walk'.
Because some ideas are universal, many different cultures developed similar ideograms. For example an eye with a tear means 'sadness' in Native American ideograms in California, as it does for the Aztecs, the early Chinese and the Egyptians.
Ideograms were precursors of logographic writing systems such as Egyptian hieroglyphs and Chinese characters.
Examples of ideographical proto-writing systems, thought not to contain language-specific information, include the Vinca script (see also Tărtăria tablets) and the early Indus script. In both cases there are claims of decipherment of linguistic content, without wide acceptance.
The oldest-known forms of writing were primarily logographic in nature, based on pictographic and ideographic elements. Most writing systems can be broadly divided into three categories: logographic, syllabic and alphabetic (or segmental); however, all three may be found in any given writing system in varying proportions, often making it difficult to categorise a system uniquely.
The invention of the first writing systems is roughly contemporary with the beginning of the Bronze Age in the late Neolithic of the late 4th millennium BC. The first writing system is generally believed to have been invented in pre-historic Sumer and developed by the late 3rd millennium into cuneiform. Egyptian hieroglyphs, and the undeciphered Proto-Elamite writing system and Indus Valley script also date to this era, though a few scholars have questioned the Indus Valley script's status as a writing system.
The original Sumerian writing system was derived from a system of clay tokens used to represent commodities. By the end of the 4th millennium BC, this had evolved into a method of keeping accounts, using a round-shaped stylus impressed into soft clay at different angles for recording numbers. This was gradually augmented with pictographic writing using a sharp stylus to indicate what was being counted. Round-stylus and sharp-stylus writing was gradually replaced about 2700-2000 BC by writing using a wedge-shaped stylus (hence the term cuneiform), at first only for logograms, but developed to include phonetic elements by about 2800 BC. About 2600 BC cuneiform began to represent syllables of spoken Sumerian language. Finally, cuneiform writing became a general purpose writing system for logograms, syllables, and numbers. By the 26th century BC, this script had been adapted to another Mesopotamian language, Akkadian, and from there to others such as Hurrian and Hittite. Scripts similar in appearance to this writing system include those for Ugaritic and Old Persian.
The Chinese script may have originated independently of the Middle Eastern scripts, around the 16th century BC (early Shang Dynasty), out of a late neolithic Chinese system of proto-writing dating back to c. 6000 BC. The pre-Columbian writing systems of the Americas (including among others Olmec and Mayan) are also generally believed to have had independent origins, although some experts have noticed similarities between Olmec writing and Shang writing that seem to suggest that Mesoamerican writing was imported from China.

1.4 The first pure alphabets (properly, "abjads", mapping single symbols to single phonemes, but not necessarily each phoneme to a symbol) emerged around 2000 BC in Ancient Egypt, but by then alphabetic principles had already been incorporated into Egyptian hieroglyphs for a millennium (see Middle Bronze Age alphabets).
By 2700 BC Egyptian writing had a set of some 22 hieroglyphs to represent syllables that begin with a single consonant of their language, plus a vowel (or no vowel) to be supplied by the native speaker. These glyphs were used as pronunciation guides for logograms, to write grammatical inflections, and, later, to transcribe loan words and foreign names.
However, although seemingly alphabetic in nature, the original Egyptian uniliterals were not a system and were never used by themselves to encode Egyptian speech. In the Middle Bronze Age an apparently "alphabetic" system is thought by some to have been developed in central Egypt around 1700 BC for or by Semitic workers, but we cannot read these early writings and their exact nature remains open to interpretation.
Over the next five centuries this Semitic "alphabet" (really a syllabary like Phoenician writing) seems to have spread north. All subsequent alphabets around the world with the sole exception of Korean Hangul have either descended from it, or been inspired by one of its descendants.

2.0 History of the telephone

2.1 There is probably no single inventor of the telephone, although Alexander Graham Bell has certainly been widely credited with its invention. Like many other inventions such as the television, its creation was a culmination of other technologies. One key technology was the telegraph.
The telephone (tele = distance, phone = sound/voice)
It has been argued that the telephone was originally invented by an Italian inventor named Antonio Meucci. He demonstrated the principles of telephony in 1849 by sending electrical impulses through a wire to communicate sounds.
In 1860 Meucci demonstrated his telephone publicly in New York. However, partially due to his poor English and lack of business experience, he was not successful in commercialising his invention.
Similar devices were also invented and demonstrated by Johann Philipp Reis from Germany in 1860, and Innocenzo Manzetti in Italy in 1864. Others between 1840 and 1875 also wrote papers and took out patents on various telephone-like devices.
However, it wasn't until Alexander Graham Bell invented and demonstrated his telephone in 1875 that the invention took off. In fact, for many years Bell was acknowledged as the inventor of the telephone, and legal cases by Meucci and others were unsuccessful in gaining recognition for their achievements. This changed in 2002, when the US Congress formally acknowledged Meucci's work in the invention of the telephone, more than a century after his death.
Bell's first telephone was one-way. Using two receivers, a wire and a battery, he was able to convey voice between rooms in June 1875. His first two-way phone call was made in March 1876, and the first 'long-distance' call was made in August of the same year, when Bell called from his family homestead in Ontario, Canada to his assistant, Mr Watson, in a town 16 kilometres away.
Bell patented his telephone invention in February 1876, and only just in time. His competitor, Elisha Gray, missed out on owning the patent on the telephone by two hours!

2.2 The first phones did not support the use of phone numbers to call people, so as more were sold it became necessary to establish manual switchboards where a human operator could connect people. The first was opened in 1878 and could handle two simultaneous conversations.
Early in the twentieth century, the first voice transmission was made across the Atlantic Ocean via radio, opening up the possibility of making telephone calls to places around the world. By 1935, that too had become a reality.
The telephone and related technologies continued to improve in quality and range. Push-button dialling, using keypads like those we see on phones today, was first trialled in 1941. A national numbering plan with area codes was introduced in America in 1946, and Caller ID, which allows the person receiving a call to see who made it, was patented in 1982.
Telephony today has gone beyond the handset and touch-tone keypad. We now have wireless phone systems that have given rise to the semi-cordless and the cordless telephone, the satellite phone and the mobile phone.
In the twenty-first century, we have combined the Internet with the telephone: Voice over IP, or Internet telephony, looks set to replace traditional telephone networks thanks to its low cost and ease of use. Like the telegraph, the telephone may one day fade from everyday use, but its history has been instrumental in the development of modern communications.

2.2.1 Alexander Graham Bell's first attempt at a speaking telephone, known as a Gallows Receiver because, turned on its side, it eerily resembles a hangman's scaffold. (June 1875)

2.2.2 The Butterstamp telephone combined the receiver and transmitter into one handheld unit. To use it, you turned a crank to signal the operator, then talked into one end, turned the instrument around, and listened at the other end. Needless to say, it confused a lot of people! (1878)

2.2.3 A Strowger dial telephone, built by the Automatic Electric Company, which manufactured the first commercial dial telephone: a device that routed calls directly to individuals through an automated switchboard, without the need to speak to a human operator. (1891)

2.2.4 Videophones were first used in France and Germany in the 1930s. They were not popular as the phones were cumbersome and expensive. However, videophones picked up in popularity in the 1990s, giving rise to videoconferencing as a business practice.

2.2.5 AT&T Touch-Tone Telephone. Early Touch-Tone sets had only 10 buttons; AT&T added the * and # keys in 1968 for use in advanced services. (1963)

2.2.6 Cordless handsets were first developed by Teri Pall in 1965. The base unit of the phone can be connected to the land-line system while the handset functions remotely through low power radio.

2.2.7 DynaTAC 8000X, the world's first commercially available handheld mobile phone, created by Motorola in 1983.

2.2.8 VocalTec Internet Phone, the first Voice over Internet Protocol (VoIP) application to be released. (February 1995)

3.0 Radio waves

3.1 Radio waves travel (propagate) through the air and the vacuum of space equally well, not requiring a medium of transport.
A radio wave is created whenever a charged object accelerates with a frequency that lies in the radio frequency (RF) portion of the electromagnetic spectrum. By contrast, emissions that fall outside the RF range include gamma rays, X-rays, infrared and ultraviolet light, and visible light.
When a radio wave passes a wire, it induces a moving electric charge (a voltage) that can be transformed into audio or other signals that carry information. Although the word 'radio' is used to describe this phenomenon, the transmissions we know as television, radio, radar, and cell phone signals are all radio frequency emissions.
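As a quick worked illustration (my own example, not drawn from the text above), the wavelength of a radio wave follows directly from its frequency:

\lambda = \frac{c}{f} = \frac{3 \times 10^{8}\ \text{m/s}}{100 \times 10^{6}\ \text{Hz}} = 3\ \text{m}

so a 100 MHz FM broadcast signal has a wavelength of about three metres, which is why typical FM aerials are on the order of a metre long (roughly a quarter to half a wavelength).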

3.2 Discovery
The theoretical basis of the propagation of electromagnetic waves was first described by James Clerk Maxwell in his 1865 Royal Society paper "A Dynamical Theory of the Electromagnetic Field", which followed his work between 1861 and 1865.
It was Heinrich Rudolf Hertz who, between 1886 and 1888, first validated Maxwell's theory through experiment, demonstrating that radio radiation had all the properties of waves (now called Hertzian waves), and discovering that the electromagnetic equations could be reformulated into a partial differential equation called the wave equation.
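In modern vector notation (my own formulation, shown for reference rather than Hertz's original working), the resulting wave equations for the fields in free space take the standard form

\nabla^{2}\mathbf{E} = \frac{1}{c^{2}}\,\frac{\partial^{2}\mathbf{E}}{\partial t^{2}}, \qquad \nabla^{2}\mathbf{B} = \frac{1}{c^{2}}\,\frac{\partial^{2}\mathbf{B}}{\partial t^{2}}, \qquad c = \frac{1}{\sqrt{\mu_{0}\varepsilon_{0}}}

and the fact that the propagation speed c computed from purely electrical constants matched the measured speed of light is what led Maxwell to identify light itself as an electromagnetic wave.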

4.0 Signal lamp

4.1 A signal lamp (also called an Aldis lamp, after its inventor Arthur C. W. Aldis) is a visual signaling device for optical communication (typically using Morse code) – essentially a focused lamp which can produce a pulse of light. The pulse is achieved by opening and closing shutters mounted in front of the lamp, either via a manually operated pressure switch or, in later versions, automatically. The lamps were usually equipped with some form of optical sight, and were most commonly used on naval vessels and in airport control towers (using color signals for stop or clearance).
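As an illustration of the signalling scheme just described, the short Python sketch below (entirely hypothetical, written for this note rather than taken from any historical source) converts a text message into the sequence of shutter-open and shutter-closed intervals an operator would produce, using International Morse code and the usual timing convention: one unit of light for a dot, three for a dash, and one, three and seven units of darkness between symbols, letters and words respectively.

# Illustrative sketch: encode a message as Aldis-lamp style light pulses.
# Timing convention: dot = 1 unit, dash = 3 units; gaps of 1, 3 and 7 units
# between symbols, letters and words respectively.

MORSE = {
    'A': '.-', 'B': '-...', 'C': '-.-.', 'D': '-..', 'E': '.',
    'F': '..-.', 'G': '--.', 'H': '....', 'I': '..', 'J': '.---',
    'K': '-.-', 'L': '.-..', 'M': '--', 'N': '-.', 'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.', 'S': '...', 'T': '-',
    'U': '..-', 'V': '...-', 'W': '.--', 'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def lamp_pulses(message):
    """Yield (shutter_open, duration_in_units) pairs for a text message."""
    words = message.upper().split()
    for w_index, word in enumerate(words):
        if w_index > 0:
            yield (False, 7)              # gap between words
        for l_index, letter in enumerate(word):
            if l_index > 0:
                yield (False, 3)          # gap between letters
            code = MORSE.get(letter, '')
            for s_index, symbol in enumerate(code):
                if s_index > 0:
                    yield (False, 1)      # gap between dots and dashes
                yield (True, 1 if symbol == '.' else 3)

# Example: the classic distress signal
for shutter_open, units in lamp_pulses("SOS"):
    print("OPEN  " if shutter_open else "CLOSED", units, "unit(s)")

Running it for the message "SOS" prints the familiar pattern of three short flashes, three long flashes and three short flashes, separated by brief pauses.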

5.0 Television sets

5.1 In television's electromechanical era, commercially made television sets were sold from 1928 to 1934 in the United Kingdom, United States, and Russia. The earliest commercially made sets sold by Baird in the UK in 1928 were radios with the addition of a television device consisting of a neon tube behind a mechanically spinning disk (the Nipkow disk) with a spiral of apertures that produced a red postage-stamp size image, enlarged to twice that size by a magnifying glass. The Baird "Televisor" was also available without the radio. The Televisor sold in 1930–1933 is considered the first mass-produced set, selling about a thousand units.
The first commercially made electronic television sets with cathode ray tubes were manufactured by Telefunken in Germany in 1934, followed by other makers in France (1936), Britain (1936), and America (1938). The cheapest of the pre-World War II factory-made American sets, a 1938 image-only model with a 3-inch (8 cm) screen, cost US$125, the equivalent of US$1,863 in 2007. The cheapest model with a 12-inch (30 cm) screen was $445 ($6,633).
An estimated 19,000 electronic television sets were manufactured in Britain, and about 1,600 in Germany, before World War II. About 7,000–8,000 electronic sets were made in the U.S. before the War Production Board halted manufacture in April 1942, production resuming in August 1945.
Television usage in the United States skyrocketed after World War II with the lifting of the manufacturing freeze, war-related technological advances, the gradual expansion of the television networks westward, the drop in set prices caused by mass production, increased leisure time, and additional disposable income. In 1947, Motorola introduced the VT-71 television for $189.95, the first television set to be sold for under $200, finally making television affordable for millions of Americans. While only 0.5% of U.S. households had a television set in 1946, 55.7% had one in 1954, and 90% by 1962. In Britain, there were 15,000 television households in 1947, 1.4 million in 1952, and 15.1 million by 1968.
For many years different countries used different technical standards. France initially adopted the German 441-line standard but later upgraded to 819 lines, which gave the highest picture definition of any analogue TV system, approximately double the resolution of the British 405-line system. However, this came at a cost: the cameras had to produce four times the pixel rate (thus quadrupling the bandwidth) from pixels one-quarter the size, reducing sensitivity by the same factor. In practice, 819-line cameras never achieved anything like the resolution the system could theoretically transmit, and for color France reverted to the same 625 lines as the European CCIR system.
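A rough back-of-the-envelope check of that claim (my own arithmetic, assuming the aspect ratio and frame rate are held fixed): doubling the line count doubles the vertical detail and also doubles the horizontal detail needed to keep the picture elements square, so

\text{pixel rate} \;\propto\; N_{\text{lines}}^{2}, \qquad \left(\frac{819}{405}\right)^{2} \approx 4.1

which is where the quadrupled bandwidth comes from, while each picture element covers roughly a quarter of the screen area of its 405-line counterpart, accounting for the corresponding loss in sensitivity.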
Eventually most of Europe switched to the 625-line PAL standard, once more following Germany's example, with France adopting SECAM. Meanwhile in North America the original NTSC 525-line standard from 1941 was retained, although analog television broadcasting in the United States ended on June 12, 2009 in favor of digital-only broadcasting.

6.0 Photophones

6.1 The photophone, also known as a radiophone, was invented jointly by Alexander Graham Bell and his then-assistant Charles Sumner Tainter on February 19, 1880, at Bell's 1325 'L' Street laboratory in Washington, D.C. Both were later to become full associates in the Volta Laboratory Association, created and financed by Bell.
Bell believed the photophone was his most important invention. The device allowed sound to be transmitted on a beam of light. On April 1, 1880 (a commemorative plaque gives the date as June 3), Bell transmitted the world's first wireless telephone message on his newly invented form of telecommunication, a distant precursor of fiber-optic communications. The wireless call was sent from the Franklin School to the window of Bell's laboratory, some 213 meters away.

7.0 Semaphore lines

7.1 A semaphore telegraph, optical telegraph, shutter telegraph chain, Chappe telegraph, or Napoleonic semaphore is a system of conveying information by means of visual signals, using towers with pivoting shutters, also known as blades or paddles. Information is encoded by the position of the mechanical elements; it is read when the shutter is in a fixed position. These systems were popular in the late 18th and early 19th centuries. In modern usage, "semaphore line" and "optical telegraph" may refer to a relay system using flag semaphore.
Semaphore lines were a precursor of the electrical telegraph. They were far faster than post riders for bringing a message over long distances, but far more expensive and less private than the electrical telegraph lines which would replace them. The distance that an optical telegraph can bridge is limited by geography and weather, thus in practical use, most optical telegraphs used lines of relay stations to bridge longer distances.