It's been so long since I keyed in my last post that I am not even going to give you readers an explanation for why the blog was dead for so long.
The blogosphere seems to have suddenly metamorphosed into 'blog-dom'. It has become more of a kingdom, a fief, or whatever else you choose to call it, than a forum. But then again, that's just me and my cynical illusions... The truth is, when there are no stories to tell, opinions are hard to manufacture and words are difficult to produce.
Amidst all the revelry that started with APOGEE (BITS' technical fest, in case you didn't know) and ended with Holi (or, for me, with a twenty-four-hour whirlwind trip to Delhi - a trip that involved philosophy, cynicism and enough proof-reading for a year), some things were lost... the most prominent among them being the seven courses I am taking this semester.
As I tried to shake off sleeplessness during my Civil labs today, I wondered about certain things. For example, instead of chain surveying to find the area of a given piece of land, we could take satellite pictures of the area (to scale, of course) and calculate the area from those. A simple idea really - probably something that is already commonplace.
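In case you're wondering how that would actually work, here is a minimal sketch: trace the plot's boundary off the scaled image and apply the shoelace formula to the traced points. The coordinates, the metres-per-pixel scale and the helper function below are all made up for illustration, not taken from any real survey.

```python
# Sketch: estimate land area from boundary points traced off a scaled
# satellite image, using the shoelace formula. All numbers are hypothetical.

def polygon_area(points):
    """Shoelace formula: area of a simple polygon given (x, y) vertices in order."""
    n = len(points)
    total = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# Boundary traced off the image, in pixels (assumed values).
boundary_px = [(10, 10), (210, 15), (220, 160), (30, 170)]

# If the image is to scale, each pixel maps to a known ground distance,
# say 0.5 metres per pixel (again, an assumed figure).
metres_per_pixel = 0.5

area_m2 = polygon_area(boundary_px) * metres_per_pixel ** 2
print(f"Estimated area: {area_m2:.1f} square metres")
```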
Which brings me back to my pet topic - the need for integrating computers into everything we do. Pervasive computing, in short. I never appreciated the concept more than when I came face-to-face with my microprocessor design assignment question. The reason? If we can design a microprocessor-based computing system around a weighing machine, that's pretty cool, isn't it? As I like to say, it sort of brings things into perspective and falsifies my claim that we don't make enough use of automatic computing tools in our lives.
The truth is, I have never been happy with my computer's speed. I remember my first PC boasted the then-latest 550 MHz Pentium III processor with 64 MB of RAM and a 10.2 GB HDD. I vaguely recall naively thinking a year later, as I regarded all the 1.2 GHz machines, that they would be twice as fast as mine. Well, sadly, I now operate a 1.66 GHz Core 2 Duo-powered machine with 1 GB of RAM and 2 MB of L2 cache, and it definitely isn't proportionately faster. Be it Windows or Linux, I haven't been able to extract the best performance out of my machine with the software I use. So is it the fault of the software, or is it the user's limitation? Surely the hardware is not to blame... Or is it?
I don't know, but it seems to me that over the last three years, the computer world has undergone a massive paradigm shift in its performance priorities. We saw it when the focus shifted from microprocessor clock speeds to power consumption; we saw it when Intel abandoned its 30-stage pipeline for a more modest 14-stage one; we see it even now, as Web 2.0 redefines the way we perceive the Internet (even though the change might be slower than we would like).
My only interpretation is that we have lingered on old ideas for too long. Maybe we have extracted all we could out of them and are now trying to extend their utility beyond their expiry date. New ideas are needed, if only to herald the death of old, seemingly precious ones. We need these new ideas to revitalize the way things are done. We need them, if only to have the old dying ideas asking, "Are we dead, my precioussssss?"
3 Responses:
Could you please blog more often? :D
More frequent posts from my side for sure.
I am all for new ideas... without new ideas the old ones won't go away, and the longer they stay, the harder it becomes to ultimately dislodge them.
PS your laptop sucks