How did the Microchip Change Computers during the 1990s?


It is no secret that your current computer is the fruit of an evolution that took decades to get where it is, and it is still far from reaching its end. If as recently as ten years ago processors did not even have multiple cores, imagining the machines that inaugurated computing is an even harder task.
Did you know that there were already computers in the early 1950s? Logically, they did not look anything like what we have today, but they already carried out some complex calculations in very little time. Over these 60 years, elements have vanished and components have been created, and at times it almost seems we are talking about completely unrelated subjects.
So get ready to learn a little more about this magnificent story. To make it easier to read, we follow the divisions adopted by some authors specialized in the subject and have split the history of computing into generations. Now take the opportunity to learn about, or revisit, the important evolution of computers.


The giant first-generation valves

Imagine what your life would be like if you needed a huge room to store a computer. Logically that would be impossible, because the first computers, such as the ENIAC and the UNIVAC, were designed only to perform calculations, being used to solve specific problems.
Why specific problems? First-generation computers did not have a standard programming language. That is, each machine had its own code and, for new functions, it was necessary to completely reprogram the computer. Want to change the problem being calculated? Reprogram the ENIAC.

These gigantic computers also suffered from constant overheating. That is because, instead of microprocessors, they used large electric valves (vacuum tubes), which allowed the amplification and switching of signals by means of electrical pulses. They operated in a way analogous to a circuit board: each valve that was lit or unlit represented an instruction to the machine.
After a few hours of use, these valves burned out and had to be replaced, so about 19,000 of them were exchanged each year in each machine. Yes, 19,000 valves: more than the total number used in an ENIAC. As you can see, these computers did not come cheap to their owners.

Transistor and reducing computers

The giant machines were not profitable because of constant maintenance expenses. The main need was to replace the electric valves with a new technology that would allow more compact construction and would not generate so much excess heat, avoiding overheating.
It was then that transistors (created in 1947 at Bell Laboratories) began to appear in the panels of computing machines. The components were made from a solid material known as silicon. Exactly: the same material used to this day in boards and other components, extracted from abundant sand.

The transistor had a series of advantages over the valves. To begin with, the dimensions of these components were far smaller, making second-generation computers a hundred times smaller than the first ones. Furthermore, the new computers were also more economical, both in energy consumption and in the price of parts.
For the commands of these computers, machine languages were replaced by assembly language. This type of programming is still used today, but instead of serving for software or operating systems, it is more frequent among hardware component manufacturers, since it works with more direct instructions.
Quite different from the 30 tons of the ENIAC, the IBM 7094 (the most successful machine of the second generation) weighed only 890 kg and, as little as that may seem, the same machine surpassed the mark of 10,000 units sold.

Curiosity: second-generation computers were initially developed to be used as control mechanisms in nuclear power plants. A similar model can be seen in the cartoon “The Simpsons,” more specifically at the workstation of Homer, a safety technician at the nuclear power plant.

Miniaturization and integrated circuits

Silicon, a material with electrical conductivity greater than that of an insulator but smaller than that of a conductor, is called a semiconductor. Integrated circuits, which pack many transistors onto a single chip of this material, ensured significant increases in the speed and efficiency of computers, enabling more tasks to be performed in shorter periods of time.
With the third generation of computers, keyboards emerged for typing commands. Monitors also allowed users to view very primitive operating systems, still completely distant from the graphical systems we know and use today.

Despite the conveniences brought by semiconductors, the computers of this generation did not shrink; one of the most successful models (the IBM 360, which sold more than 30,000 units) weighed more than its predecessors. At this time (the late 1970s and early 1980s), computers also became more affordable.
Another breakthrough of the third generation was the addition of upgrade capability to the machines. Companies could buy computers with certain configurations and increase their capabilities as needed, paying relatively little for these improvements.

Microprocessors: The beginning of personal computers

We finally get to the computers that many users still use to this day. Fourth-generation computers were the first to be called “microcomputers” or “micros.” The name comes from the fact that they weighed less than 20 kg, which made them easy to store.
Can you imagine which component made it possible to shrink the machines? Those who said microprocessors got it right. The emergence of small control and processing chips made computing much more affordable, besides offering a huge range of new options for users.

“Microcomputers” got their name for weighing less than 20 kg

In 1971, processors were already being built in this new format, but only in the mid-1970s did the first personal computers begin to appear commercially. The Altair 8800 could be purchased as an assembly kit, sold through specialized magazines in the United States. It was for this machine that Bill Gates and Paul Allen created a BASIC interpreter and inaugurated the Microsoft dynasty.

The Apple II, Apple's second computer, ran a modified version of the BASIC system, created by Microsoft. The great breakthrough presented by the system was the use of a graphical interface for some software. It was also possible to use text processors, spreadsheets, and databases.
Apple itself was responsible for introducing the mouse to personal computing, along with graphical operating systems such as the Macintosh's. Shortly afterwards, Microsoft launched the first version of Windows, quite similar to its rival's system.

Cycles become clocks

Until the third generation of computers, a machine's response time was measured in cycles: certain actions were timed over short periods so that it was possible to know what fraction of a second they used. With microprocessors, it was no longer feasible to measure capabilities that way.
That is why clock ratings appeared. This measure counts the number of processing cycles that can be completed in a single second. For example, 1 MHz means that the chip can perform 1 million cycles in just one second.
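The arithmetic behind a clock rating can be sketched in a few lines of Python (the function name and constants here are illustrative, not from any real datasheet):

```python
# A clock frequency in hertz is simply "cycles per second",
# so the duration of one cycle is the reciprocal of the frequency.

def cycle_time_ns(frequency_hz: float) -> float:
    """Duration of one clock cycle, in nanoseconds."""
    return 1e9 / frequency_hz

MHZ = 1_000_000
GHZ = 1_000_000_000

# A 1 MHz chip completes 1 million cycles per second,
# so each cycle lasts 1000 nanoseconds.
print(cycle_time_ns(1 * MHZ))   # 1000.0

# A modern 2 GHz core squeezes each cycle into half a nanosecond.
print(cycle_time_ns(2 * GHZ))   # 0.5
```

Note that this only describes raw cycle counts; how much useful work fits into each cycle varies from chip to chip, which is exactly the point the next generation of processors would address.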
Most of the personal computers launched at that time were powered by Intel processors, the same Intel that today makes some of the most powerful chips, such as the Intel Core i7 (which we will talk about more soon). These machines were light enough to take computing to a new plateau.

The importance of Apple

At the same time, the two Steves (Jobs and Wozniak) created Apple to dedicate themselves to personal computing projects accessible to lay users. Thus came the Apple I, a project that was first offered to HP. It was succeeded by the Apple II, after an injection of 250,000 dollars from a former Intel executive.

Notebooks: Portable Generation

Considering that progress in computing is inversely proportional to the size occupied by the components, it is no surprise that computers were eventually transformed into portable devices. Notebooks emerged as luxury objects (as computers themselves were until just over ten years ago), expensive and with limited market reach.

In addition to notebooks, netbooks are also available on the market. They work similarly, but usually have smaller dimensions and more modest configurations. They earn points for extreme portability and battery life, and are certainly one more step in the evolution of computers.
Today, the price to be able to take the documents, files, and programs to all places is not much higher than that charged by desktops. Even so, the market is still far from reaching its peak. Who knows what the next step in the industry will be?

Multiple cores: the 5th generation?

We are still transitioning from a phase in which processors tried to reach ever higher clocks to one in which the real question is how those clocks can be better utilized. It is no longer enough to achieve processing speeds above 2 GHz; it has become standard for each chip to carry more than one core at these frequencies.
Processors that simulated the existence of two processing cores came to the market first, followed by those that actually contained two of them. Today there are four-core processors, and others, used by servers, that already offer eight. With so much power performing simultaneous tasks, a new necessity emerged.
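The gain from multiple cores comes from splitting independent work across them. A minimal sketch in Python, using the standard-library process pool (the prime-counting workload is a toy example chosen only because it keeps a core busy):

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit: int) -> int:
    """Naive CPU-bound task: count the primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [20_000, 20_000, 20_000, 20_000]

    # Serial: a single core grinds through one chunk after another.
    serial = [count_primes(c) for c in chunks]

    # Parallel: each chunk can land on a different core.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(count_primes, chunks))

    # Same answers either way; only the wall-clock time differs.
    assert serial == parallel
```

The `if __name__ == "__main__":` guard is required on some platforms for process pools to start correctly, and the speedup is only real when the chunks are truly independent, which is exactly the constraint multi-core software design revolves around.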

Green processing

It is well known that the more tasks a computer executes, the more electric power it consumes. To counter this, chip manufacturers have been searching for ways to reduce consumption without diminishing capabilities. That is how the concept of “green processing” was born.
Intel's Sandy Bridge Core processors, for example, were manufactured with a smaller microarchitecture, so that less electric power is drawn while tasks are carried out more efficiently. Performing tasks with this type of component is therefore good for the user and also for the environment. It is worth saying that the same applies to Ivy Bridge, Skylake, and all the families that followed.

Another element involved in this concept is the assembly process. Manufacturers are constantly seeking ways to reduce the environmental impact of their factories. Notebooks, for example, are being built with LED-backlit screens, much less harmful to nature than ordinary LCDs.

We do not yet know when the sixth generation of computers will emerge. Some people consider artificial intelligence to be this new generation, but others say robots are not part of that denomination. In any case, what matters is realizing that, over time, humanity has been working constantly to improve its machines.
Who would have imagined, 60 years ago, that one day it would be possible to carry a computer in a backpack? And who today would imagine that, 60 years ago, it would have taken a train to transport one? Today, for example, there are already quite powerful smartphones.
And for you, what will be the next step in this evolution of machines? Use the comments to share what you think of the improvements delivered over the decades.

