The Future of Computing Machines

During the last five decades, smaller and faster semiconductor circuitry has fuelled an information technology boom, producing cheaper and more powerful computing devices that have boosted virtually every aspect of our economy. The result is the digital revolution we see around us: the computers, TVs, laptops, tablets, smartphones, iPhones, iPods, video game systems, the Internet, social networks…the list is endless. Technological innovation and miniaturization are moving so fast that older computers and other devices become obsolete almost as quickly as they come on the market.
The invention of the transistor in 1947 is considered one of the greatest technological achievements of the 20th century. The three-terminal device provided an extraordinary capability to control an electric current, which was initially used for amplifying signals and later for switching the 1s and 0s needed for digital computing. With the advent of the transistor and the accompanying work in semiconductors, it became possible to envisage electronic equipment built into a solid block with no connecting wires, and this led to the integrated circuit (IC) in 1958. Fabricating an entire circuit on a silicon wafer or "chip" was a major breakthrough in size reduction, since more components could be fitted into a smaller area. Over the last five decades we have watched these computer chips become smaller, cheaper, faster, more powerful and more efficient than their predecessors. This kicked off the solid-state era of electronics and the high-tech revolution.
In 1965, Gordon Moore, co-founder of Intel, observed that the number of transistors per square inch on integrated circuits had doubled roughly every 18 to 24 months since their inception in 1958, an observation now known as Moore's Law. In 1964, there were about 32 transistors on a chip measuring about 4 square millimeters. In 1971, Intel's first microprocessor, the 4004, had 2,300 transistors. Today, the latest 4th Gen Intel Core processor has 1.4 billion transistors, runs at 3 GHz and is built on a 22 nm manufacturing process. In 2013, Intel planned another shrink, to a 14 nm process. Not only have individual chips shrunk dramatically, we now have system-on-a-chip (SoC) technology, in which one monolithic chip houses all the major hardware components and electronic circuitry for a "system" such as a cell phone or digital camera. This not only reduces system complexity, cost and power consumption, it also saves space, making it possible to fit a high-end computer from yesteryear into a smartphone that can fit in our pocket.
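As a rough check on that doubling rate, the two figures quoted above (2,300 transistors in 1971 and 1.4 billion in 2013) imply a doubling period of a little over two years. The short Python sketch below assumes only those two data points from the text and works out the implied rate; it is a back-of-envelope illustration, not a precise statement of Moore's Law.

    # Back-of-envelope check of Moore's Law using the two figures
    # quoted in this article: the Intel 4004 in 1971 with 2,300
    # transistors, and a 4th Gen Core processor in 2013 with 1.4 billion.
    import math

    t0_year, t0_count = 1971, 2_300
    t1_year, t1_count = 2013, 1_400_000_000

    doublings = math.log2(t1_count / t0_count)            # about 19.2 doublings
    years_per_doubling = (t1_year - t0_year) / doublings  # about 2.2 years

    print(f"{doublings:.1f} doublings over {t1_year - t0_year} years")
    print(f"roughly {years_per_doubling:.1f} years per doubling")

With these two data points the script reports roughly one doubling every 2.2 years, in line with the 18-to-24-month rule of thumb.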
The question is how long we can continue the miniaturization of chips, i.e. fitting more and more transistors on an IC. There are announcements from chip makers stating that they have figured out new ways to shrink the size of transistors. But the truth is that we are simply running out of space to work with. First of all, the chip pathways certainly can't be made shorter than single molecules. We also have to consider the fundamental limits imposed by the laws of thermodynamics and of quantum mechanics. When we cram more and more components onto a chip, they become smaller and thinner. The components on today's microprocessors are now on the nanoscale (10^-9 meter), and at this scale things become weird: the area of physics called quantum mechanics begins to take over, giving us a picture of the world that defies the logic of day-to-day life.
Since transistors are all about controlling current flow in ICs, as they get smaller and thinner the tightly packed electrons start appearing in places they're not wanted: a phenomenon called quantum tunneling takes place. This quantum electron leakage doesn't produce a huge amount of current, but it does suck a bit of power and generate a bit of heat. That bit of heat, multiplied by a billion transistors, could melt the chips inside their packaging, and it puts a real spanner in the shrinkage works. Thus conventional computing's progress is slowing as it becomes increasingly difficult to cram more circuitry onto silicon chips. While processors have followed Moore's Law for some time, that progress has slowed and will continue to slow. Some researchers and scientists predict that Moore's Law will end around 2020 at the 7 nm node.
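To get a feel for why "a bit of heat times a billion transistors" matters, here is a minimal sketch in Python. The per-transistor leakage current and supply voltage are purely illustrative assumptions chosen for order of magnitude, not measured values; only the transistor count comes from the article.

    # Illustrative estimate of leakage power on a large chip.
    # The leakage current and supply voltage below are assumed,
    # order-of-magnitude values for illustration only.
    num_transistors = 1_400_000_000   # transistor count quoted in the article
    leakage_per_transistor_nA = 10    # assumed leakage per transistor (nanoamps)
    supply_voltage_V = 1.0            # assumed supply voltage (volts)

    leakage_current_A = num_transistors * leakage_per_transistor_nA * 1e-9
    leakage_power_W = leakage_current_A * supply_voltage_V

    print(f"Total leakage current: {leakage_current_A:.1f} A")
    print(f"Leakage power (heat):  {leakage_power_W:.1f} W")
    # Under these assumptions the chip dissipates about 14 W while doing
    # no useful work at all, which is why leakage limits further shrinkage.

Even with these modest assumed numbers, the idle heat is comparable to a chip's entire power budget, which is the crux of the problem described above.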
What technology will dominate the post-silicon era? What will replace Silicon Valley? No one knows. Scientists are now looking at the development of quantum computers, molecular computers, protein computers, DNA computers and the like, but it is certain that the next decade of computing is going to bring us gadgets and devices that today we can only dream of. The next generation of computing machines may be able to simulate human intelligence and interface directly with the human brain, and by 2030-2040 computers could become more intelligent than human beings.
Over the last decade, quantum computation has become an extremely exciting and rapidly growing field of investigation. It promises to shrink computers to the size of molecules, equivalent to squeezing today's supercomputer onto the head of a pin, and it is being hailed as the future of data processing, with promises of performing calculations thousands of times faster than modern supercomputers. In this direction, an increasing number of researchers from different streams, ranging from physics and computer science to information theory and mathematics, are investigating the properties of quantum-based computation.
