Business Standard

Rajesh Jain: Computing's kumbh mela cycle

FUTURE TECH/ The next major computing breakthrough is just around the corner. What will it usher in?


Rajesh Jain New Delhi
India's kumbh mela takes place every 12 years. It sees the largest gathering of pilgrims in the world bathe in the Ganga to purify themselves. The four locations where it is celebrated are supposed to be the places where the gods spilt "amrit", the elixir of immortality.
 
Twelve years or so is also the cycle for computing breakthroughs. By my reckoning, the next major computing breakthrough is just around the corner.
 
What will computing's next kumbh mela usher in? On this hangs the fate of more than 500 million "devotees," a billion aspirants, hundreds of billions of dollars in spending and the future of today's tech giants.
 
Before we look ahead, let us take a peek into computing's past and revisit the previous kumbh melas. An understanding of the past is useful even as we peer into the future.
 
1945 saw the invention of the world's first general-purpose electronic computer, the ENIAC (Electronic Numerical Integrator and Computer). PBS.org has more: "ENIAC, with its 17,468 vacuum tubes, 70,000 resistors, 10,000 capacitors, 1,500 relays and 6,000 manual switches, was a monument of engineering. The project was 200 per cent over budget (total cost approximately $500,000). But it had achieved what it set out to do. A calculation like finding the cube root of 2589 to the 16th power could be done in a fraction of a second. In a whole second ENIAC could execute 5,000 additions, 357 multiplications, and 38 divisions. This was up to a thousand times faster than its predecessors... ENIAC's main drawback was that programming it was a nightmare. In that sense it was not a general use computer. To change its program meant essentially rewiring it, with punchcards and switches in wiring plugboards. It could take a team two days to reprogram the machine."
 
In the late 1950s, IBM switched from vacuum tubes to transistors. VisionEngineer.com writes: "Vacuum tubes are large, expensive to produce, and often burn out after several hundred hours of use. As electronic systems grew in complexity, increasing amounts of time had to be spent just to ensure that all the vacuum tubes were in working order. Transistors, in comparison, rarely fail and are much cheaper to operate."
 
Around the same time, in 1957, IBM introduced Fortran (FORmula TRANslation), a programming language based on algebra, with formal grammar and syntax rules, which went on to become the most widely used computer language for technical work. These twin breakthroughs made computers reliable and easily programmable.
 
In 1969, IBM changed the way it sold technology. It unbundled the components of hardware, software and services and offered them for sale individually. This is what gave birth to an independent software industry.
 
1969 also saw the setting up of the Arpanet, which later grew to be the internet. Soon after, in 1971, came the birth of the first general-purpose microprocessor, the Intel 4004. Unix began its life around the same time, as did the programming language "C".
 
1970 was also the year when the theory of relational databases was introduced by Ted Codd at IBM. Taken together, these developments in semiconductors, software and networks laid the foundation for modern-day computing.
 
1981 saw the launch of the IBM personal computer. This is from the IBM archives: "[It] was the smallest and, with a starting price of $1,565, the lowest-priced IBM computer to date. The IBM PC brought together all of the most desirable features of a computer into one small machine. It offered 16 kilobytes of user memory (expandable to 256 kilobytes), one or two floppy disks and an optional color monitor. When designing the PC, IBM for the first time contracted the production of its components to outside companies. The processor chip came from Intel and the operating system, called DOS (Disk Operating System), came from a 32-person company called Microsoft." The rest, as they say, is history.
 
IBM's decision to source the two key components from external suppliers led to the modularisation of the computer industry, and the emergence of Intel and Microsoft as its two superpowers. In 1982, Time magazine chose the personal computer as its "Machine of the Year".
 
The period of 1992-94 saw many key developments that have shaped our present. Microsoft launched Windows 3.1, which rapidly became the standard desktop interface for millions. Around the same time, Intel started shipping its Pentium processors. The duo's dominance led to the coining of the phrase "Wintel".
 
SAP launched its enterprise software program, R/3, which established the client-server paradigm. The internet's commercialisation and proliferation got a major boost when Marc Andreessen and his team at the National Center for Supercomputing Applications in the US launched Mosaic, a graphical web browser based on the HTTP and HTML standards.
 
Computing has come a long way since the development of the first computer in 1945. Even though innovation has happened in an almost-continuous manner, every 12 years or so a paradigm shift takes place that rings out the old and rings in the new.
 
So, the next computing kumbh mela should happen sometime soon (or is already underway). What is it going to be? Microsoft's Longhorn? Google as the supercomputer? Cellphones as always-on, always-connected computers? Utility computing? Wearable computers? Something unseen as of today? We will take a look ahead in the next column.
 
Rajesh Jain is managing director of Netcore Solutions Pvt Ltd. His weblog is at http://www.emergic.org. He can be contacted at rajesh@netcore.co.in.

 
 


First Published: Aug 11 2004 | 12:00 AM IST
