Category Archives: Tech Industry

Gmail Mass Email Deletions

Michael Arrington (and others) reported yesterday on a problem with Gmail, as described here: Gmail Disaster: Reports Of Mass Email Deletions. Regardless of how this incident ultimately turns out, and without assigning either blame or praise to Google or anyone else who may or may not be involved, this is an incident that everyone interested in the future of network computing needs to take to heart. If I had been a guest on the (apparently now defunct) Gillmor Gang when Steve Gillmor launched into one of his “rich client is dead, long live the network” diatribes, I would have responded with something along the lines of this:
“I predict that sometime within the next year or two, there will be some kind of major incident–a serious security breach, a significant service outage, an accidental or deliberate release of data, a NOC screw-up, a government investigation, a service provider buyout or bankruptcy, whatever–that will cause anyone interested in moving to a thin-client/network-centric computing model to seriously reconsider their plans.”
This Gmail incident may turn out to be nothing, but consider all of the other incidents that have happened in the past few years: the AOL release of user search data, the security breach at the CardSystems credit card processing company, the brief service outage at salesforce.com, and the government’s subpoena of search records in its porn investigation. Considering all this along with what can easily be imagined in the future, are corporations really going to want to entrust some of their most sensitive data to third-party service providers whose behavior and business practices are completely outside of their control? Will individuals?
It’s worth remembering that we once operated on a centralized computing model built around mainframes, and we moved away from that model for good reasons (single point of failure, service degradation with increased usage, etc.). Centralized computing has significant benefits, but it has significant risks and drawbacks as well; the same is true of the decentralized, client-based model.
IMHO, the best approach would be a hybrid model in which data formats and communications protocols are open and standardized, data can reside either on servers or on local client machines and can be easily and transparently moved or synchronized between the two as needed, and the applications used to view and edit that data can be client-based, server-based, or both. This way, individuals and corporations can choose the level of centralization they are comfortable with, and everybody wins. Except, perhaps, those companies interested in selling you servers (Sun) or thick-client operating systems (Microsoft, Apple).
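
To make the idea concrete, here is a minimal sketch (in Python) of what such transparent synchronization might look like, assuming the simplest possible last-write-wins policy keyed on modification timestamps. The Document class and sync() function are hypothetical names invented for illustration, not any real product’s API; a real system would need conflict detection, merging, and authentication on top of this.

    # A minimal sketch of transparent client/server synchronization,
    # assuming a last-write-wins policy based on modification time.
    # Document and sync() are hypothetical, illustrative names.
    from __future__ import annotations
    from dataclasses import dataclass

    @dataclass
    class Document:
        doc_id: str
        content: str
        modified: float  # modification time, seconds since the epoch

    def sync(local: dict[str, Document], remote: dict[str, Document]) -> None:
        """Reconcile the two stores in place: newer copies win, new copies propagate."""
        for doc_id in set(local) | set(remote):
            l, r = local.get(doc_id), remote.get(doc_id)
            if l is None:
                local[doc_id] = r       # exists only on the server; pull it down
            elif r is None:
                remote[doc_id] = l      # exists only on the client; push it up
            elif l.modified > r.modified:
                remote[doc_id] = l      # local edit is newer; push it up
            elif r.modified > l.modified:
                local[doc_id] = r       # remote edit is newer; pull it down

    # Example: a locally edited memo overwrites the stale server copy.
    local = {"memo": Document("memo", "draft v2", modified=200.0)}
    remote = {"memo": Document("memo", "draft v1", modified=100.0)}
    sync(local, remote)
    assert remote["memo"].content == "draft v2"

Last-write-wins is the crudest possible policy, but it captures the point: with open formats and a simple protocol, data can flow in either direction, and neither the client nor the server has to be the sole authority.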

Intel Sheds XScale Processor Unit

Intel sells handheld chip unit to Marvell (AP)

AP – Intel Corp. said Tuesday it will sell its division that makes processors for handheld gadgets to Marvell Technology Group Ltd. for $600 million in cash, as the world’s biggest semiconductor maker focuses on its main business of supplying chips for PCs and computer servers.
There are basically two ways of looking at this: (1) Intel has yet again lost patience with a strategic investment that hasn’t immediately yielded a multi-billion-dollar revenue stream; or (2) Intel has plans to migrate its x86 architecture down into a mobile form factor, and it wants to rid itself of a competing architecture before it does so. The part of me that once believed in Intel’s strategic competence wants to believe the latter, but the part of me that sold all my Intel stock leans toward the former.
Although Intel doesn’t break out the performance of the division, analysts said it remained unprofitable as Intel overestimated its ability to break into a business that was outside its core competence.
Funny, I remember a time when designing, manufacturing and selling chips was Intel’s core competency.