Wednesday, March 26, 2008


I remember writing long back about how the ability to understand a language lends meaning to meaningless sounds. Today, let me write about how language has become a part of our learning.

Let me tell you about three specific words I heard during today's Microprocessor lecture - polling, interrupt and handshake.

For those of you who do not know what these words mean in a computer-science context, let me just give you some very rudimentary definitions:

  1. Polling means the processor keeps in step with a device by continuously checking its status for changes. Can you draw a parallel with a small, shy child, or even an adult who doesn't really like drawing attention to himself and so has to be constantly monitored and taken care of?
  2. An interrupt has the device signal the processor whenever it needs something done. Can you draw a parallel with chronic attention-seekers here?
  3. A handshake has the processor and the device continuously exchange status signals to ensure a smooth transfer of data. Can you draw a parallel with the good communication associated with a successful relationship?

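The three schemes above can be caricatured in a few lines of toy Python. This is a simulation of the ideas, not real device-driver code; `Device`, `polling_transfer` and the other names are invented here purely for illustration:

```python
class Device:
    """A pretend peripheral that becomes ready after a few status checks."""
    def __init__(self, data, checks_until_ready=3):
        self.data = data
        self._checks_left = checks_until_ready

    def ready(self):
        self._checks_left -= 1
        return self._checks_left <= 0

def polling_transfer(device):
    """Polling: the processor loops, repeatedly asking 'are you ready yet?'"""
    while not device.ready():
        pass  # busy-wait: processor time is burnt just checking the status
    return device.data

def interrupt_transfer(device, on_interrupt):
    """Interrupt: the device invokes the processor's handler when it has data."""
    on_interrupt(device.data)  # the 'interrupt' is modeled as a callback

def handshake_transfer(device):
    """Handshake: each transfer is framed by ready/acknowledge signals."""
    log = ["processor: ready to receive",
           "device: data valid"]
    data = device.data
    log += ["processor: acknowledge",
            "device: acknowledge seen, transfer complete"]
    return data, log
```

Usage mirrors the analogies: the polled device is watched over constantly, the interrupting device demands attention on its own schedule, and the handshake pair keep talking to each other throughout the transfer.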
My point here is very simple. Learning mirrors life, which in turn, is represented by language. So, learning mirrors language. You can even put it the other way around and say that language mirrors learning.

The fact that we have well-formed languages makes things much simpler. This Apogee, we had a Spanish-speaking lecturer talking about supercomputing. Neither distance nor the language barrier hindered the exchange, simply because of the common language we spoke - that of science translated into English, albeit with a very Spanish flavour for an accent.

Sometimes, I feel an astounding sense of wonder when something I first read in a local book turns up, referred to in exactly the same terms, in material originating from some other part of the world. For me, the extent to which the common language of learning has permeated the fabric of our species is simply awe-inspiring. They say globalization has brought about this trend of global learning, but could it not be that global learning has brought about globalization?

Monday, March 24, 2008

Are We Dead, My Precioussssss???

It's been so long since I keyed in my last post that I am not even going to give you readers an explanation for why the blog was dead for so long.

The blogosphere seems to have suddenly metamorphosed into 'blog-dom'. It's become more of a kingdom, a fief, or whatever else you choose to call it, rather than a forum. But then again, that's just me and my cynical illusions... The truth is, when there are no stories to tell, opinions are hard to manufacture and words are difficult to produce.

Amidst all the revelry that started with APOGEE (BITS' technical fest, in case you didn't know) and ended with Holi (or, for me, with a twenty-four-hour whirlwind trip to Delhi - a trip that involved philosophy, cynicism and lots of proof-reading, enough for a year actually), some things were lost... the most prominent among them being the seven regular courses I am doing this semester.

As I tried to shake off sleepiness during my Civil labs today, I wondered about certain things - for example, how, instead of chain surveying to find the area of a given piece of land, we could take satellite pictures of it (to scale, of course) and calculate the area from those. A simple idea, really - probably something that is already commonplace.

Which brings me back to my pet topic - the need to integrate computers into everything we do. Pervasive computing, in short. I never appreciated the concept more than when I came face-to-face with my microprocessor design assignment question. The reason? If we can design a microprocessor-based computing system around a weighing machine, that's pretty cool, isn't it? As I like to say, it sort of brings things into perspective and falsifies my claim that we don't make enough use of automatic computing tools in our lives.

The truth is, I have never been happy with my computer's speed. I remember my first PC boasted the then-latest 550 MHz Pentium III processor with 64 MB of RAM and a 10.2 GB HDD. I vaguely recall naively thinking a year later, as I regarded all the 1.2 GHz machines, that they would be twice as fast as mine. Well, sadly, I now operate a 1.66 GHz Core 2 Duo-powered machine with 1 GB of RAM and 2 MB of L2 cache, and it definitely isn't proportionately fast. Be it Windows or Linux, I haven't been able to extract the best performance out of my machine with the existing software I use. So is it the fault of the software, or is it the user's limitation? Surely the hardware is not to blame... Or is it?

I don't know, but it seems to me that over the last three years, the computing world has undergone a massive paradigm shift in its performance priorities. We saw it occurring when the focus moved from microprocessor clock speeds to power consumption; we saw it occurring when Intel abandoned the Pentium 4's deep 31-stage pipeline for a more modest 14-stage one; we see it happening even now, as Web 2.0 redefines the way we perceive the Internet (even though the change might be slower than we would like).

My only interpretation is that we have lingered on old ideas for too long. Maybe we have extracted all we could out of them and are now trying to extend their utility beyond their expiry date. New ideas are needed, if only to herald the death of old, seemingly precious ones. We need these new ideas to revitalize the way things are done. We need them, if only to have the old dying ideas asking, "Are we dead, my precioussssss?"