Friday, February 15, 2008

Change in Technology

Technology changes and evolves faster than anything else in today's world: blink an eye and you see a new technology, framework, design pattern, tool, or piece of software out there. Just a few decades back, people were using punch cards to enter data, although the punch card itself is far older. Charles Babbage designed his Analytical Engine to take its input from punched cards, and later the technology giant IBM built unit record machines for business data processing. IBM shipped its first commercial electronic computer in the early 1950s; now, in 2008, barely six decades since the origin of this whimsical, magical, and powerful hardware, we see miracles being performed with it.

Every 18 to 20 months, advances in technology virtually double the computing power a dollar can buy. One statistical study suggests that in the last 35 years computing performance has increased by a factor of a million while the entry-level price has decreased by a factor of 1,000. First came punch cards (pure hardware), then computers bundled with software called an operating system, which made access to the machine seamless and transparent. People then started writing tools and applications in lower-level languages like C, COBOL, and Pascal. Next came higher-level languages like Java, C++, and Perl, which made it much easier for programmers to write browser-based (IE, Mozilla Firefox, Netscape) applications. All of this led to the evolution of the World Wide Web (WWW) and took the world by storm. People began using a simple browser interface to access the plethora of information available in the world.

People once used room-sized machines (early IBM mainframes built with vacuum tubes), then came the desktop, a mean-looking machine that fits on your desk, then the laptop, portable enough to sit easily on your lap. Now we see people using palmtops, PDAs, and even smaller handheld devices. It won't be an overstatement to say that hardware the size of a wallet or a pen will soon replace all of the above.

The first time I heard about computers was in 8th grade, and I was fascinated by their ability to make any complex task simpler. The first program I wrote was in C (a very simple character array printing module); yesterday I was programming in Java, and today I am talking about virtualization and agile frameworks with my colleagues. I wonder: will something I am learning today become obsolete tomorrow?

In such a rapidly changing environment, a question always haunts me: how should I keep my resume up to date with all the buzzwords and new technologies? I welcome your opinions on the same.
