The Development of Computer Hardware: Processors, Computer Generations, and Computer Classification
Hardware
Computer hardware is the physical part of a computer, as distinguished from the data that resides in it or is operated on by it, and from the software that provides the instructions for the hardware to carry out its tasks. Computer hardware is commonly divided into three groups:
* Input Devices
Input devices are used to enter data, whether text, images, or sound, into the computer. Examples include:
1. Mouse, a device for giving commands when processing or editing data.
2. Scanner, a device for entering data such as pictures or charts and converting them into digital form so they can be processed and combined with text data.
3. Joystick
4. Digital Camera
5. Microphone
6. Digitizer
7. Touch Screen
8. Touch pad
9. Track ball
10. Light pen
11. Handycam
12. Keyboard
* Processing Devices, used to process data. Processing hardware includes the central processing unit (CPU) and microprocessors.
* Output Devices
Output devices are used to present the data produced by the computer, for example:
1. Monitor, a device that displays text or images from the data being processed by the CPU.
2. Printer, a device that produces output in printed form, as text and graphics.
3. Speaker, a device that produces output in the form of sound.
4. Projector
CPU Development
The processor is a very important part of a computer: it functions as the computer's brain, and without a processor a computer is just a dumb machine that cannot do anything. To reach today's speeds the processor has gone through many generations, from the 4004 microprocessor used in the Busicom calculating machine up to quad-core Intel Xeon processors.
Processor development began with Intel, whose processors were at the time the only microprocessors available; today there are many processors from other manufacturers, so users can choose from a wide variety.
1. Microprocessor 4004 (1971)
The processor era began in 1971, when Intel released its first processor, used in the Busicom calculator. This invention marked the beginning of intelligent machine systems. The chip was called the 4004 microprocessor. Intel's 4004 was the CPU that pioneered placing all the components of a calculating machine on a single IC. At that time, the IC worked on only one task.
2. Microprocessor 8008 (1972)
In 1972 Intel released the 8008 microprocessor, with twice the computing speed of its predecessor. It was the first 8-bit microprocessor, and it too was designed to work on a single job at a time.
3. Microprocessor 8080 (1974)
In 1974 Intel released the 8080. In this series Intel moved from a multivoltage design to a triple-voltage scheme and used NMOS technology, making it faster than the previous series, which used PMOS. This microprocessor became the brain of the first personal computer, the Altair. By this point it could already address up to 64 kilobytes of memory, and it was about ten times as fast as the previous microprocessor.
The same period also saw microprocessors from other manufacturers, such as Motorola's MC6800 (1974) and Zilog's Z80 (1976), the two heavyweight rivals, as well as the 6500-series processors made by MOS Technology, Rockwell, Hyundai, WDC, NCR, and others.
4. Microprocessor 8086 (1978)
The 8086 was the first 16-bit CPU, but at the time most mainboards still followed the 8-bit standard, because 16-bit motherboards were expensive. So in 1979 Intel redesigned the processor to be compatible with 8-bit mainboards and named it the 8088 (one could logically call it an 8086sx). IBM used this processor for its computers because it was cheaper than the 8086 and could use mainboards descended from the 8080 era. The technology in this processor also differed from the 8080 series: the 8086 and 8088 used HMOS technology.
5. Microprocessor 286 (1982)
The Intel 286, better known as the 80286, was the first processor that could recognize and run software written for its predecessors. The 286 (1982) was also a 16-bit processor. It represented a relatively big advance over the first-generation chips: clock frequencies improved, but the main improvement was optimized instruction handling, so the 286 produced more work per clock tick than the 8088/8086.
At its initial speed (6 MHz) it performed about four times better than an 8086 at 4.77 MHz. It was later introduced at clock speeds of 8, 10, and 12 MHz, as used in the IBM PC-AT (1984).
Another innovation was the ability to work in protected mode, a new working mode with 24-bit virtual addressing, which marked the shift from DOS toward Windows and multitasking. However, you could not switch from protected mode back to real mode without rebooting the PC, and the only operating system that used it at the time was OS/2.
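The jump to 24-bit addressing can be made concrete with a quick calculation: an N-bit address can name 2**N distinct byte locations, which is why the 8080's 16-bit addressing topped out at 64 KB while the 286's 24-bit mode reached 16 MB. A minimal sketch:

```python
# Addressable memory as a function of address width:
# N address bits can name 2**N distinct byte locations.

def addressable_bytes(address_bits):
    return 2 ** address_bits

print(addressable_bytes(16) // 1024)           # 16-bit -> 64 KB, the 8080's limit
print(addressable_bytes(24) // (1024 * 1024))  # 24-bit -> 16 MB, the 286's protected mode
```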
* Transistor: a component shaped like a very small tube, found on the chip.
* Micron: a unit of measure (10^-6 meter), describing the width of the smallest wires in a chip.
* Clock speed: the maximum speed of a processor.
* Data width: the width of the arithmetic logic unit (ALU), the unit that handles arithmetic such as subtraction, division, multiplication, and so forth.
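The effect of data width on the ALU can be sketched in a few lines: an N-bit ALU can only hold results from 0 to 2**N - 1, so larger results wrap around (overflow). The function below is an illustrative model, not real hardware:

```python
# A minimal sketch of how "data width" limits an ALU's arithmetic.

def alu_add(a, b, data_width):
    """Add two unsigned integers the way an ALU of the given width would."""
    mask = (1 << data_width) - 1   # e.g. a 16-bit mask is 0xFFFF
    return (a + b) & mask          # keep only the low N bits (wraparound)

# A 16-bit ALU (like the 8086's) wraps around past 65535:
print(alu_add(65535, 1, 16))   # -> 0
# A 32-bit ALU holds the same sum without overflow:
print(alu_add(65535, 1, 32))   # -> 65536
```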
Computer development
First Generation
With the onset of the Second World War, the countries involved sought to develop computers to exploit their strategic potential. This increased funding for computer development projects and hastened technical progress. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to design airplanes and missiles.
The Allies also made progress in the development of computing power. In 1943, the British completed a secret code-breaking computer called Colossus to decode messages encrypted by Germany. Colossus did not greatly influence the development of the computer industry, for two reasons. First, Colossus was not a general-purpose computer; it was designed only to decode secret messages. Second, the existence of the machine was kept secret until decades after the war ended.
Work done in the United States at that time produced another advance. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electronic calculator for the U.S. Navy. The calculator was half the length of a football field and contained some 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, was an electromechanical relay computer. It used electromagnetic signals to move mechanical components. The machine operated slowly (taking 3-5 seconds per calculation) and was inflexible (the sequence of calculations could not be changed). The calculator could perform basic arithmetic as well as more complex equations.
Another computer development of the period was the Electronic Numerical Integrator and Computer (ENIAC), created through cooperation between the United States government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, the computer was such a huge machine that it consumed 160 kW of power.
Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1000 times faster than the Mark I.
In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team and built a computer design concept that would still be used in computer engineering 40 years later. In 1945 von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both programs and data. This technique allows a computer to stop at some point and then resume its job later. The key element of the von Neumann architecture is the central processing unit (CPU), which allows all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), made by Remington Rand, became the first commercial computer to use the von Neumann architecture. Both the United States Census Bureau and General Electric owned UNIVACs. One of the UNIVAC's impressive achievements was its success in predicting Dwight D. Eisenhower's victory in the 1952 presidential election.
First-generation computers were characterized by operating instructions made specifically for a particular task. Each computer had a different binary-coded program, called a "machine language". This made the computers difficult to program and limited their speed. Other features of first-generation computers were the use of vacuum tubes (which made the computers of that era very large) and magnetic cylinders for data storage.
Second Generation
In 1948, the invention of the transistor greatly influenced the development of the computer. Transistors replaced vacuum tubes in televisions, radios, and computers. As a result, the size of electronic machinery shrank drastically.
Transistors came into use in computers beginning in 1956. Another development of the period, magnetic-core memory, helped make second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to use these new technologies were supercomputers. IBM made a supercomputer named Stretch, and Sperry-Rand made a computer named LARC. These computers, developed for atomic-energy laboratories, could handle large amounts of data, a capability needed by atomic researchers. The machines were very expensive and tended to be too complex for business computing needs, which limited their popularity. Only two LARCs were ever installed and used: one at the Lawrence Radiation Labs in Livermore, California, and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language, a language that uses short abbreviations (mnemonics) in place of binary code.
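The idea behind assembly language can be sketched in a few lines: each short mnemonic stands for a binary opcode. The mnemonics and opcodes below are invented for illustration; they do not belong to any real CPU.

```python
# A toy "assembler": translates hypothetical mnemonics into binary opcodes.
OPCODES = {
    "LOAD":  0b0001,  # load a value into the accumulator
    "ADD":   0b0010,  # add a value to the accumulator
    "STORE": 0b0011,  # store the accumulator to memory
    "HALT":  0b1111,  # stop the machine
}

def assemble(program):
    """Turn a list of mnemonic lines into a list of opcode integers."""
    return [OPCODES[line.split()[0]] for line in program]

machine_code = assemble(["LOAD 5", "ADD 3", "STORE 0", "HALT"])
print([bin(op) for op in machine_code])  # -> ['0b1', '0b10', '0b11', '0b1111']
```

This is exactly the convenience the text describes: a programmer writes "ADD" instead of memorizing the bit pattern 0010.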
In the early 1960s, second-generation computers began to appear successfully in business, in universities, and in government. Second-generation computers were built entirely with transistors. They also had the components we associate with computers today: printers, disk storage, memory, operating systems, and programs. One important example was the IBM 1401, which was widely accepted in industry. By 1965, almost all large businesses used second-generation computers to process financial information.
The program stored inside the computer and the programming language that came with it gave the computer flexibility. This flexibility increased performance at a reasonable price for business use. With this concept, a computer could print customer invoices and, moments later, run a product design or calculate payroll. Several programming languages appeared at the time. The Common Business-Oriented Language (COBOL) and FORTRAN (Formula Translator) came into common use. These languages replaced complicated machine code with words, sentences, and mathematical formulas more easily understood by humans, making it easier to program and manage computers. New types of careers emerged (programmer, analyst, and computer-systems expert), and the software industry also began to emerge and evolve during this second generation of computers.
Third Generation
Although transistors surpassed vacuum tubes in many respects, they still generated substantial heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz sand. Scientists later managed to fit more and more components onto a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation development was the operating system, which allowed machines to run many different programs at once, with a central program that monitored and coordinated the computer's memory.
Fourth Generation
After the IC, the only direction left was to shrink the size of circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components on a chip. In the 1980s, Very Large Scale Integration (VLSI) put thousands of components on a single chip.
Ultra-Large Scale Integration (ULSI) increased that number into the millions. The ability to fit so many components onto a chip half the size of a coin drove down the price and size of computers. It also increased their power, efficiency, and reliability. The Intel 4004 chip, made in 1971, advanced the IC by putting all the components of a computer (central processing unit, memory, and input/output control) on one very small chip. Previously, ICs were made to perform one specific task. Now a microprocessor could be manufactured and then programmed to meet any requirement. Before long, everyday household items such as microwave ovens, televisions, and cars with electronic fuel injection were equipped with microprocessors.
Such developments allowed ordinary people to use computers. Computers were no longer the exclusive domain of big companies or government agencies. In the mid-1970s, computer assemblers began offering their products to the general public. These computers, called minicomputers, were sold with software packages easy enough for lay users. The most popular software of the time were word-processing and spreadsheet programs. In the early 1980s, video games such as the Atari 2600 sparked consumer interest in more sophisticated, programmable home computers.
In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982. Ten years later, 65 million PCs were in use. Computers continued their evolution toward smaller sizes, from desktop computers to computers that fit in a bag (laptops), and even computers that could be held in the hand (palmtops).
The IBM PC competed with the Apple Macintosh in the battle for the market. The Apple Macintosh became famous for popularizing a graphical system on its computers while its rival still used text-based interfaces. The Macintosh also popularized the mouse. Today we know the lineage of IBM-compatible CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium IV (a series of CPUs made by Intel), as well as the AMD K6, Athlon, and others. All of these belong to the fourth generation of computers.
Along with the proliferation of computers in the workplace, new ways of exploiting their potential were developed. As small computers grew more powerful, they could be linked together in networks to share memory, software, and information, and to communicate with one another. Computer networks let computers cooperate electronically to complete a processing task. Using direct cabling (in a local area network, or LAN) or telephone lines, networks can become very large.
Fifth Generation
Defining the fifth generation of computers is quite difficult because this stage is still very young. A famously imaginative example of a fifth-generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all the functions desired of a fifth-generation computer. With artificial intelligence, HAL has enough reasoning ability to hold conversations with humans, use visual feedback, and learn from his own experience.
Although the realization of HAL 9000 may still be far off, many of its functions have already been achieved. Some computers can receive verbal instructions and can mimic human reasoning. Translating foreign languages has also become possible. Such a facility looks deceptively simple, but it turned out to be far more complicated than expected once programmers realized that human understanding depends heavily on context and meaning rather than on translating individual words directly.
Many advances in computer design and technology are making the fifth-generation computer increasingly possible. Two main engineering advances stand out. The first is parallel processing, which will replace the von Neumann model with a system able to coordinate many CPUs working in unison. The second is superconductor technology, which allows electricity to flow without resistance, accelerating the speed of information.
Japan is the country best known for popularizing the idea of the fifth-generation computer project, and the ICOT institute (Institute for New Computer Technology) was formed to realize it. Many reports say the project has failed, but others suggest that success in this fifth-generation project would bring a new paradigm to the computing world. We await more definitive information.
The Contents of the CPU
The CPU, or Central Processing Unit (here referring to the computer's system unit), contains the various kinds of hardware required to run a computer. The following sections describe the hardware usually found inside it.
Hard Drive
The hard disk drive (HDD) is where the CPU unit stores data. If a drive is opened, you can see the metal platters inside on which data is written. Rotation speeds vary: some spin at 5400 revolutions per minute, others up to 7200. A hard drive's capability is usually measured by the amount of data it can store, which varies from 1.2 gigabytes (GB) to 80 GB. One GB is 1000 megabytes, and one megabyte is 1000 kilobytes. Quite large, isn't it? We can store all our data on this disk.
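The decimal units used above (1 GB = 1000 MB, 1 MB = 1000 KB, as drive makers count them) can be sketched in a few lines; the 80 GB and 1.4 MB figures are simply the ones mentioned in this article:

```python
# Decimal storage units, as hard drive manufacturers define them.
KB = 1000          # bytes
MB = 1000 * KB
GB = 1000 * MB

# How many 1.4 MB floppy disks would an 80 GB hard drive replace?
floppies = (80 * GB) // (1400 * KB)
print(floppies)  # -> 57142
```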
A. Definition of the Hard Disk
A hard disk is secondary-storage hardware in which data is stored as magnetic pulses on integrated, rotating metal platters. A hard disk is also called permanent memory because it retains data even when the computer's power is off, unlike RAM, which cannot store data permanently: when the power dies, its data is lost. A hard disk can act as input or output. It acts as input when it transfers data to another hard disk or to a flash drive, and as output when data is written to it.
B. History of the Hard Disk
In the early days of its development, the hard drive was dominated by the giant company that set the computer standard, IBM. In the following years other companies appeared, such as Seagate, Quantum, Conner, and, in 1992, Hewlett-Packard. At first the read/write head touched the metal storage platter. Nowadays this is avoided, because with today's high rotation speeds, contact between the head and the platter would physically damage the disk. A brief history follows:
1. Punched card
The oldest known data storage is the punched card, created in 1725 by Basile Bouchon. At the time it was used to store woven-fabric patterns by punching holes in rolls of paper.
2. Selectron tube
In 1946 RCA began making the Selectron Tube, the first form of computer-based memory, about 30 cm long with a capacity of 4 Kb. This memory did not last long on the market because its price was too high.
3. Magnetic tape
Magnetic tape is a data storage medium usually used with mini- or mainframe-class computers.
4. Floppy disk
The floppy disk was first introduced in 1969 at a size of 20 cm, able to hold 80 Kb of data but usable only once. Four years later, a disk of the same size was improved to hold 256 Kb and could be used repeatedly. Year after year, diskettes shrank in size while their capacity grew. Today, however, diskettes are rarely used because of the arrival of the hard disk.
5. Hard disk
On 13 September 1956, IBM introduced its newest computer, the IBM 305 RAMAC. It was revolutionary at the time because the IBM 305 RAMAC came with the world's first hard disk, with a capacity of 4.4 MB. The hard disk itself, remarkably, consisted of 50 platters 60 cm in diameter. IBM rented this computer out for Rp 30 million per month. The hard drive is still in use today, far smaller in size yet, of course, far greater in capacity.
C. Capacity
Hard drive capacities today have reached hundreds of GB and even TB (terabytes). This is because materials technology keeps improving, allowing higher data density. Western Digital can now make a 200 GB hard drive spinning at 7200 RPM, while Maxtor's MaxLine II is a 300 GB hard disk at 5400 RPM. Along with the transition to smaller hard disks with larger capacities, the dramatic decline in the price per megabyte of storage has put large-capacity hard drives within reach of ordinary computer users.
Floppy Disk Drive
A floppy disk drive is a device for reading from or writing to a diskette. Several years ago, many people still used 5 1/4-inch floppy disks (large diskettes), which stored 700 kilobytes of data. Today the large diskette has been replaced by a small one (3 1/2 inch) with a capacity of 1.4 megabytes.
A floppy drive works much like a hard disk drive. The circular platter inside the diskette, which holds the data, is rotated by a motor in the floppy disk drive, and a magnetic head reads or writes the data on the diskette.
CD-ROM drive
Its function is to read data from a Compact Disc (CD). ROM stands for Read Only Memory, meaning storage that can only be read. So a CD-ROM drive can only read data; it cannot be used to store data. Nowadays, however, there is a similar device that can write/save data to a CD: the CD-RW drive (CD Read and Write). A CD-ROM or CD-RW drive works much like a floppy disk drive. The difference is that the disc being spun is a CD, and the reader is not a magnetic head but a small laser beam.
Processors
The processor's function is to handle all the calculations a computer must perform. Processing power is measured by frequency, from 550 MHz (megahertz) up to, at the time of writing, 1.4 GHz (gigahertz).
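What a clock frequency means in practice can be sketched with a small calculation: a processor running at f hertz completes one clock tick every 1/f seconds, so the two speeds mentioned above correspond to cycle times of roughly 1.8 and 0.7 nanoseconds.

```python
# Relating clock frequency (Hz) to the duration of one clock cycle.

def cycle_time_ns(frequency_hz):
    """Duration of one clock cycle, in nanoseconds."""
    return 1e9 / frequency_hz

print(round(cycle_time_ns(550e6), 2))  # 550 MHz -> 1.82 ns per cycle
print(round(cycle_time_ns(1.4e9), 2))  # 1.4 GHz -> 0.71 ns per cycle
```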
When the computer is turned on, the processor starts working immediately and its temperature rises rapidly. Therefore every processor is now equipped with a heat spreader (heat sink) and a cooling fan. The most widely used processors today come from Intel, AMD, and IBM.
Memory
Memory is also known as RAM (Random Access Memory). It is used for temporary storage of data while the processor is working with it. If the computer's power is turned off, the data in RAM is lost. Reading data from RAM is much faster than reading it from the hard drive.
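The volatile-versus-permanent distinction between RAM and the hard disk can be sketched with a small simulation; here an ordinary Python variable stands in for RAM and a temporary file stands in for the hard drive (both stand-ins are illustrative, not real hardware behavior):

```python
# RAM contents vanish when the process (or power) stops; disk contents persist.
import os
import tempfile

ram = {"session": "my unsaved work"}   # lives only in this process's memory

# Persist the value to "disk" (a temp file stands in for the hard drive):
path = os.path.join(tempfile.gettempdir(), "demo_persist.txt")
with open(path, "w") as f:
    f.write(ram["session"])

del ram  # simulate losing power: the in-memory copy is gone

with open(path) as f:      # the disk copy survives
    restored = f.read()
print(restored)            # -> my unsaved work
os.remove(path)
```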
Graphics Card (VGA Card)
The VGA card (Video Graphics Adapter) translates the computer's output for display on the monitor. For drawing, graphic design, or playing games, we need a high-powered VGA card. Cards are currently available with 16, 32, or 128 megabytes of memory. One famous make is Nvidia's GeForce.
Sound Cards (Soundcard)
This device lets the computer produce sound, which is very useful when listening to music or playing games. The sound can be stereo, surround, or even three-dimensional, so that we seem to be inside the scene. But this device is incomplete without speakers, so we connect speakers to the installed sound card with a cable plugged directly into it.
Motherboards
The motherboard, also called the mainboard, is where all the main CPU components mentioned above are mounted. It takes the form of an electronic circuit board.
The motherboard is where data passes back and forth. It connects all the computer's equipment and makes the parts work together so that the computer runs smoothly.
Grouping Computers
1. Some definitions of the term "computer"
1. A computer is a series or group of electronic machines consisting of thousands or even millions of components that cooperate with one another to form a neat and precise working system. This system can then be used to carry out a series of jobs automatically, based on a sequence of instructions, or program, given to it. source: http://medicalzone.org/fuldfk/viewtopic.php?t=2744
2. A computer is a product of advances in electronics and informatics technology that serves as a tool for writing, drawing, editing pictures or photos, creating animations, running scientific-analysis programs, simulations, and equipment control. source: http://www.total.or.id/info.php?kk=computer
3. A computer is a supporting tool that lightens human tasks in solving various problems, because the equipment offers operational speed, memory storage, reliability, and cost savings. source: Class XI ICT textbook, Widya Utama
2. Computer classification
Computers can be classified by the data they process, by their capabilities, by their capacity and size, and by their problem domain.
2.1. Based on the data processed, computers are classified into three types:
1. Analog computers
An analog computer is a type of computer used to process qualitative data. The data is not symbolic; rather, it represents a physical state, for example temperature, humidity, altitude, or speed: conditions that the computer converts into a measurement.
Analog computers are widely used in manufacturing plants to control processes or produce a product. The notion of an analog computer is closer to robotics or automated machinery.
2. Digital computers
A digital computer is a type of computer used to process quantitative data (data that is very numerous). The data of a digital computer usually consists of symbols with specific meanings, for example: alphabetic symbols, depicted with the letters A through Z (or a through z); numeric symbols, depicted with the digits 0 through 9; or special symbols such as ? / + * & !.
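How a digital computer actually stores these letters, digits, and special symbols can be shown in a few lines: each symbol is represented by a numeric code (here, its ASCII code), which in turn is stored as binary digits.

```python
# Each symbol a digital computer handles is stored as a number, and that
# number is stored as binary digits. Here we show the ASCII code and its
# 8-bit binary form for a letter, a digit, and a special symbol.

for symbol in ["A", "9", "&"]:
    code = ord(symbol)                        # the symbol's numeric code
    print(symbol, code, format(code, "08b"))  # and its 8-bit binary form
```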
3. Hybrid computers
A hybrid computer is a type of computer that can process both quantitative and qualitative data; it can be described as a combination of an analog computer and a digital one. Computers of this type are widely used in hospitals to check the condition of a patient's body, with the computer ultimately producing various analyses presented as images, graphics, or text.
2.2. Based on their capabilities, computers are classified into three types:
1. Small-scale computers
Computers of this type have a capacity of 64 Kb to 8 Mb and can handle dozens of computer terminals separate from the central computer.
2. Medium-scale computers
This kind of computer system has a capacity of between 512 Kb and 8 Mb and can handle hundreds of computer terminals separate from the central computer.
3. Large-scale computers
This kind of computer system also has a capacity of between 512 Kb and 8 Mb, but it operates at a higher speed.
2.3. Based on capacity and size, computers are classified into three types:
1. Microcomputers (Personal Computers / PCs)
At first, this kind of computer system was created to meet the needs of individuals. An individual's needs for storing or processing data are certainly not as great as a company's, so the capabilities and technology of the early Personal Computer were very limited.
With the benefits of a relatively low price, a small form, and technology considered adequate at the time, the personal computer became popular very quickly. Personal computers are now used not only by individuals but also, ultimately, by companies to solve their various problems.
article from: http://rslant.web.id/perkembangan-prosesor-atau-hardware-komputer-lainnya.htm