Cringely has delayed his 2006 predictions to put up this thought on Google, given the splash they’ve made at CES.
The key thing is that whatever Google does, it is enabling itself to serve content targeted more granularly, and to charge premium rates for that granular targeting. It already works with Gmail and other Google offerings. And, as IP and TV channels merge, this targeting will be sold into mainstream media, not compete with it: targeted ads on your TV or future commercial media device. How granular can you get? (Part of the capability to do this locally, in near real time, is what lies behind their massive distributed server containers reported earlier.)
Incidentally, Google’s habit of beta-releasing unannounced new products, and withdrawing them if they fail, is visible as a designed strategy of “fast failures”, rather than over-hyped, disappointingly late, white-elephant flops.
As Cringely puts it: they aren’t afraid to try new things, and having tried them, also aren’t afraid to shut them down if they don’t seem to be working as intended. All of this is by design. Google has turned beta code into a weapon, creating “beta” programs that, in the case of Gmail, had more than three million testers signed up before it went from beta to production. A beta test is a wonderful thing because it can be ended with a whimper but not with a lawsuit. Betas for Google are sometimes real statements of product direction and sometimes not, but Google competitors have no way of knowing which is which until they, too, have devoted resources to competing with something that may have no long-term existence.
As Cringely says, the speculation is not all his. Managed granularity has been an aspect of my info-modelling quest for the last few years.