Analysis of IT news

Wednesday, February 18, 2009

How netbooks are heading the wrong way

Netbooks have become very popular, so much so that they are the fastest-growing PC segment in this gloomy year.

But netbooks have already started to head in the wrong direction. As I already wrote, people have a tendency to see a new technology as a newcomer to an existing market instead of a market of its own (see Seeing a replacement when there's none).

And netbooks are no exception. For instance, HP's netbook ships with Windows Vista. Yes, resource-hungry Vista on a very lightweight PC. But don't think that Linux-based netbooks are any better: they all have Windows envy and all feature applications such as a word processor.

The problem is that PC manufacturers are thinking like their customers. That is, their *current* customers, not the potential customers - those who don't have any laptop. And what do those current customers want? To be able to run their favorite apps. They're craving faster netbooks, larger screens, etc. I'm sure pre-installing Vista was a no-brainer for HP: other netbooks ship with Windows XP, theirs has Vista. Isn't that a competitive advantage? You will also notice that the netbook pioneer, the Asus Eee PC, has become more powerful, larger and more expensive as time went by - so much so that the latest model looks almost like a regular notebook.

In other words, any truly innovative netbook will come from companies that are *not* PC manufacturers. Only such a company can see the true potential of a netbook: being an Internet appliance more than a computer. Something that doesn't try to run Microsoft Office, that boots instantly and that hides all the complexity of a computer - just like smartphones are doing.

It's no surprise then that the first ray of hope came from chipmaker Freescale, which just announced a $200 netbook for later this year. Instead of using an Intel Atom like all the other netbooks, they're using their own ARM-based chip, which combines the processor, the graphics chip and the memory bridge. As a result their netbook will be very thin and consume very little power (their chip allegedly consumes 3 watts as opposed to 14 watts for the Intel Atom chipset). They of course had to sacrifice Windows compatibility, something PC manufacturers will never do.

I wish Apple would announce a netbook, but they will only enter this market if they can release something that blasts what's already out there - just like they did with the iPhone. And they might not want to do so if Steve fears it would cannibalize their entry-level laptops. The jury's still out.

Tuesday, February 17, 2009

On Singularity

There's been renewed interest lately in the so-called Singularity. The Singularity is the moment when computers become as powerful as a human brain; from that day on, computers will be intellectually superior to humans.

I'll be upfront: I don't buy it - not in the near future at least. As a matter of fact, the Singularity reminds me of all the buzz around Artificial Intelligence in the '70s and '80s.

I don't buy it, but not because it sounds unbelievable. If in 1950 someone had predicted how powerful computers would be today and how they would be used, he would have sounded crazy too.

I don't believe in the Singularity as it's presented to us because the model used by its advocates is fundamentally wrong. Their argument is the following: nowadays we can simulate in a computer how several neurons interact with each other. Following Moore's law, computing power grows exponentially, so we should eventually - and sooner than we think - come up with computers that can simulate an entire human brain. Some advocates believe the Singularity will be reached within the next couple of decades.
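
To make the advocates' reasoning concrete, here is a back-of-the-envelope sketch of it in Python. Every constant is an illustrative assumption of mine (estimates of brain and machine throughput vary wildly), not a figure from the advocates themselves:

    # Back-of-the-envelope sketch of the Singularity advocates' argument.
    # All constants are illustrative assumptions, not established figures.
    BRAIN_OPS = 1e16      # assumed operations/second of a human brain
    MACHINE_OPS = 1e13    # assumed operations/second of a 2009 machine
    DOUBLING_YEARS = 2.0  # assumed Moore's law doubling period

    years, ops = 0.0, MACHINE_OPS
    while ops < BRAIN_OPS:
        ops *= 2                # one doubling period
        years += DOUBLING_YEARS
    print("Brain-scale computing in roughly %d years" % years)

With these numbers the sketch prints "roughly 20 years" - which is exactly why advocates keep talking about the next couple of decades. Note that even changing an assumption by a factor of ten only shifts the date by a handful of years; that's the seductive power of the exponential.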

I however see one flaw in this theory. Put a few neurons together and you indeed have basically an electronic circuit that is easy to simulate. But when you put billions of neurons together - enough to make a human brain - something weird happens: a consciousness emerges. Call it a soul, a spirit or whatever you want, but no one can deny there is *something* that appears and which has emotions - something not taken into account at all by any computer model. Is consciousness a side effect of electricity? Of a chemical reaction? Something else? Nobody right now has *any* clue about what consciousness really is, or why and how it appears.
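
To be clear about what "easy to simulate" means, here is a minimal sketch of the kind of model in question - two leaky integrate-and-fire neurons, one exciting the other. The parameters are arbitrary illustrative values, not biological measurements:

    # Two leaky integrate-and-fire neurons, neuron 0 exciting neuron 1.
    # All parameters are arbitrary illustrative values.
    LEAK, THRESHOLD, WEIGHT = 0.95, 1.0, 0.6

    v = [0.0, 0.0]                 # membrane potentials
    for step in range(50):
        spike0 = v[0] >= THRESHOLD
        if spike0:
            v[0] = 0.0             # neuron 0 fires and resets
        v[0] = v[0] * LEAK + 0.2   # constant input current drives neuron 0
        v[1] = v[1] * LEAK + (WEIGHT if spike0 else 0.0)  # synapse 0 -> 1
        if v[1] >= THRESHOLD:
            print("neuron 1 fired at step", step)
            v[1] = 0.0

A dozen lines per handful of neurons: that part really is trivial. What no amount of such code captures is the *something* that emerges when you wire up billions of them.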

I'll use a parallel: Newton's laws of motion have been very useful to humanity - they are a cornerstone of modern mechanics. But Newton's model also says we can reach any speed we want, provided we supply the necessary energy. Except that reality is more complex than what Newton imagined, and past roughly 100,000 km/s his model doesn't work anymore. That's when we need to switch to another model to predict motion accurately: Einstein's theory of relativity.
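
You can see where the handoff between the two models happens with three lines of arithmetic. The relativistic correction is the Lorentz factor, and at 100,000 km/s it already amounts to about 6%:

    import math

    C = 299792458.0  # speed of light, m/s

    def lorentz_factor(v):
        """Relativistic correction factor; 1.0 means Newton is exact."""
        return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

    print(lorentz_factor(1.0e8))     # ~1.06: Newton ~6% off at 100,000 km/s
    print(lorentz_factor(0.99 * C))  # ~7.09: Newton is hopelessly wrong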

Well, something tells me that the neuron model we know works for a small number of neurons, but past a certain critical mass it's not valid anymore and we need a new model. Until then, trying to predict when the Singularity will happen is just ludicrous.

I'll end with another story. Back in the '50s, John von Neumann thought that computers would help us not just predict but *control* the weather. His argument was that any system contains points of unstable equilibrium, and that you can control the state of that system by acting at these points. Imagine a ball on the top of a hill that you can make roll in any direction you want by giving it an appropriate small nudge. Von Neumann's thinking was that it was just a matter of finding the unstable equilibrium points of the atmosphere, and we would know how to control the weather. It was "just" a matter of number crunching. Except that von Neumann's model was flawed: he didn't realize that *all* the points of the atmosphere are unstable equilibrium points (the famous butterfly effect). Once again reality turned out to be much more complex than what he imagined. Half a century later computers are more powerful than von Neumann could ever have imagined, but we still can't reliably predict the weather a week in advance. Just a matter of number crunching, huh?
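
The butterfly effect is easy to reproduce yourself. Here is a classic toy illustration of my own choosing (the logistic map, a textbook chaotic system - not von Neumann's weather equations): two runs starting one part in a billion apart lose all resemblance within a few dozen steps:

    # The logistic map x -> r*x*(1-x) is chaotic at r = 4: a difference of
    # one part in a billion in the starting point ruins the "forecast".
    r = 4.0
    x, y = 0.4, 0.4 + 1e-9         # the butterfly flaps its wings
    for step in range(1, 51):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        if step % 10 == 0:
            print("step %2d: diff = %.9f" % (step, abs(x - y)))

By step 40 or so the two trajectories have nothing in common anymore. The atmosphere behaves the same way, which is why raw number crunching never delivered the control von Neumann hoped for.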

And guess what? "It's just a matter of number crunching" is exactly what Singularity advocates are telling us. Could reality be more complex than what they imagine?

Monday, February 16, 2009

The Web, 15 years later

Roughly 15 years after the World Wide Web became mainstream, it's interesting to look back at the predictions made back then.

The outcome? Our predictions were wrong most of the time. We all realized at an early stage that the Web would forever change a lot of things, but we missed the mark on pretty much everything else.

It's not Netscape that turned out to become the Internet giant, but Google. People still don't buy their groceries online - they overwhelmingly stick to their supermarkets - but they now look on Craigslist rather than in newspaper ads when looking for houses (at least in the U.S.), a deadly blow to local newspapers' ad revenue. Some companies like Amazon or eBay leveraged being the first comers in their respective markets, whereas latecomers such as Google or Facebook were able to displace the incumbents.

Let's first look at a few false predictions:

  • Myth: the end of the middleman. The Internet was supposed to get rid of the middleman - buyers could go directly to sellers' websites, cutting out the middleman.
  • Reality: if the Internet has removed the difficulty of physically dealing with a business, it has introduced other hurdles: 1) finding said business among the millions of other websites and 2) trusting the site enough to give it your credit card number. Face it: Google, eBay, Amazon or PayPal are middlemen, and they're stronger than ever.
  • Myth: the rise of the network computer at the expense of the PC.
  • Reality: over the last decade we've seen a huge improvement in Web interface capabilities thanks to technologies like AJAX or Flash. As a result, users replaced some of their fat clients with Web counterparts. But "some" doesn't mean "all". As a matter of fact, the Internet has created a whole class of new fat-client applications: IM clients, peer-to-peer clients, you name it. Even Google has released fat-client applications such as Google Earth to have an edge. Even if there is a need for Internet appliances, they won't replace Windows PCs anytime soon.
  • Myth: the end of brick and mortar.
  • Reality: there are some goods users are just not ready to buy online. For instance, buying groceries online never caught on. Several brick-and-mortar stores have been hurt though, as a lot of customers go to the retail store to check out a product but then buy the same product online for less.
  • Myth: the rise of an independent press and "citizen journalism". With the rise of blogs one could have thought that news would become more and more independent.
  • Reality: 99.999% of blogs rely exclusively on information available on the Internet. So even though they might have no strings attached, they nonetheless rely on information published by the Associated Press and the like.
  • Myth: the "push" model. Companies like PointCast advocated the "push model" where the information is "pushed" to the customer, as opposed to being "pulled" whenever they want.
  • Reality: the "push" model was a buzzword for a year or so. The closest form of push model is RSS - even though it's far from being used that much. The difference? RSS is a simple, open format.

Unpredicted outcomes

- The rise of user-provided content: the initial model of the Web was that website owners would provide most of the content. Sure, in the '90s people had fun creating their own Web pages on GeoCities et al., but the process was rather cumbersome and thus limited. Then things got simpler and simpler: blogs let pretty much everybody publish their own thoughts, YouTube lets people share their amateur videos, and Wikipedia lets everyone build a part of a worldwide encyclopedia. The content might be amateurish compared to sites that provide professional content (think of it as reality TV compared to traditional TV shows), but websites that capitalized on that trend sometimes reaped huge benefits. Ask the founders of YouTube, sold to Google for $1.65 billion.

- A rise of interactions among users: just as users provide a lot of content, some websites have enjoyed tremendous success by becoming a mere medium between users - where the interaction between users becomes more important than the content itself. That started with Web-based email clients (Hotmail), instant messaging services (ICQ), eBay or even peer-to-peer. One special class of very successful sites is networking sites: Friendster, MySpace, Facebook, LinkedIn, but also all the dating sites where people are ready to pay a monthly fee - almost unheard of on the Web. This shouldn't come as a surprise. After all, the first killer app on the Internet was email, back in the early '70s.

- Services we thought were commodities turn out not to be commodities after all. When Google started, the Web search space was already crowded: Yahoo!, AltaVista, Lycos, Excite, you name it. And nobody thought it was a particularly hot space - it's just about indexing a lot of pages, right? That's what Eric Schmidt thought before meeting the Google boys. So the lesson is: something you might think is trivial might become a hot market (for further information see also When will new giants appear).

- The rise of malware: up to 1993-1994 the Netiquette was rather strictly enforced. Post something wrong on a forum and you would quickly get scolded by the community - when not viciously flamed. But when the Web made the Internet go mainstream and each month's new users outnumbered the old ones, the Netiquette flew out the window. It started with spam, but things got worse and worse as time went by: worms, email viruses, phishing, you name it.

Lessons

- You never know when and where the next giant will rise. In 2003 the Web space already seemed quite crowded, yet companies like YouTube or Facebook came out of nowhere to become some of the hottest sites on the Web. One thing though: we might see fewer new giants. There might be a lot of entrepreneurs who want to build the next Google, but VCs would rather finance the next YouTube - you make big bucks much faster that way.

- User interactions are paramount. The first killer app on the Internet - way before the Web - was email. Email has been somewhat replaced by instant messaging or texting, but the concept is the same. And some of the hottest websites or Internet apps these days are just a medium favoring user interactions: networking websites (Facebook, MySpace), video (YouTube), auctions (eBay) or P2P applications.

- There are some basic human desires that will always sell. Dating sites might be the sites that have the least problem charging their users. Likewise, porn sites are big revenue-makers. This shouldn't have come as a surprise: in France, during the Minitel heyday, adult services were a huge part of the Minitel economy. But besides sex, people have other basic needs they want to fulfill - never forget this. And guess what? Human interaction is one of them.

- Any technology will be used for bad purposes.

- Take any prediction with a grain of salt, because most of them end up dead wrong. We tend to be overoptimistic about upcoming "revolutions" that never happen (the push model, the so-called new economy) - and at the same time we're taken by surprise by some other technology (the rise of Google).

Sunday, February 15, 2009

Amazon, the Kindle 2 and eBooks

Amazon's release of the Kindle 2 has generated some much-needed excitement in these tough economic times - especially among members of the struggling printed press.

But eBooks are still in their infancy and are hitting the very same problems the music industry hit - except the publishing industry hasn't fixed them yet.

1) Forced bundles (magazines and newspapers only)

Remember how several music majors were initially reluctant to sell individual songs and only wanted to sell whole albums? They forgot that the reason albums exist is the physical constraints of the hard media. There is no rationale for not selling individual tracks separately other than greed (the "creativity" argument is just BS).

And guess what? We see the same thing with electronic newspapers. The Internet got us used to reading a few articles from a wide variety of magazines, but eBooks still force us to buy the whole magazine. If you had to buy every magazine from which you read just a few articles on the Internet, the bill would add up quickly.

What should be done here? My thought would be a plan a bit like cell phone plans: instead of a quota of minutes, people would purchase a quota of words' worth of articles. Easy, and you don't have to wonder whether you want to purchase each individual article.
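
Here is a minimal sketch of what I have in mind - every name and number below is hypothetical, just to make the idea concrete:

    # Hypothetical word-quota plan: buy words per month, not whole magazines.
    class WordPlan:
        def __init__(self, monthly_quota=50000):   # made-up quota
            self.remaining = monthly_quota

        def read(self, article_words):
            """Debit one article; returns False once the quota is exhausted."""
            if article_words > self.remaining:
                return False                       # time to buy a top-up
            self.remaining -= article_words
            return True

    plan = WordPlan()
    for words in (1200, 800, 2500):    # three articles from three magazines
        print(plan.read(words), plan.remaining, "words left")

No per-article buying decision, no forced bundle: you read across titles and the meter just runs, exactly like cell phone minutes.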

2) Proprietary standards

Perhaps the biggest battle between music enthusiasts and the music majors was over DRM. For years the music industry clung to selling copy-protected music - never mind that people can easily download any music they want illegally and *choose* to buy songs legally. It's only after years of failures - and the desire to sell songs for the iPod without going through Apple - that the music industry reluctantly agreed to sell unprotected songs. This tremendously helped online music sales, as consumers are no longer tied to their MP3 player.

Once again, the publishing industry hasn't gotten over the same fear and clings to its own DRM. As a result, if you want to migrate from, say, a Kindle to a Plastic Logic reader a few years down the road, you must also part with all your electronic purchases.

For this, the publishing industry will need to suck it up and let go of this stupid DRM. No Digital Rights Management system has ever prevented anybody from making illicit copies.

Saturday, February 14, 2009

Microsoft to open its own retail stores

News item: Microsoft plans to open its own retail store chain.

Analysis: if it's successful for Apple, then it's logical to see Microsoft follow. A lot of people wonder about the timing. On the one hand Microsoft is trying to cut costs, so it seems like a bad time to launch a retail chain. But on the other hand, considering the number of retail chains closing, malls are probably anxious to see anybody move in, so Redmond should be able to strike good lease deals. Remember, Microsoft always thinks long term.

Now, contrary to Apple's, Microsoft's retail stores will be less a sales channel than a marketing channel. Apple sells hardware, which mostly goes through retail stores anyway, so Apple makes a much better margin on the products it sells through its own stores. For Microsoft it's the opposite - except maybe for the Xbox and the Zune. Because they mostly sell software, Redmond currently enjoys the best deal they're ever going to get: they market to billions of consumers but sell at very low cost to a handful of PC manufacturers, letting the latter go through the trouble of installing and supporting Windows. How do you think Microsoft gets an 80% profit margin on its two cash-cow products? There's no way Redmond can make better margins selling Windows directly to consumers.

But Microsoft's focus here is to regain its mojo and coolness. Aside from the Xbox, Microsoft products have failed to generate any excitement for several years now - when they haven't become a PR nightmare like Vista. That's not a good thing in the long term, even in the case of Windows, where they have the market pretty much locked up.

Back in the good old days, all Redmond had to do was focus on nurturing relationships with a limited number of big influencers: PC manufacturers (starting with IBM), journalists, etc. If Microsoft managed to seduce them, consumers would blindly embrace any new product Redmond released (remember Windows 95?). Small cost, huge effect. But this approach doesn't work anymore: customers no longer automatically switch to Vista just because Steve Ballmer says so - let alone buy a Zune or a Windows-powered phone. Ironically, Redmond is now suffering from the same problem its competitors suffered a few decades ago: products like the Zune might be good but it doesn't matter, because 1) few people have heard of them and 2) they're not compatible with all the peripherals available for the iPod. So Microsoft now has to send in the foot soldiers - in the present case an army of retail store reps - to sell the company to consumers.

The key question is thus whether Microsoft will be able to attract people to its stores and regain some coolness. Dell and Gateway tried to come up with showrooms and failed. Bringing a Wal-Mart guy on board to head the retail store division will surely bring some operational experience, but what about the sex appeal of the stores? Besides, stores alone aren't enough if the products aren't sexy. The success of the Apple stores comes from consumers *wanting* to know more about Apple products in the first place, because they are sexy. They wanted to see the iPhone and the MacBook Air even if they had no intention of buying one. Can Microsoft do the same?

Saturday, February 07, 2009

Microsoft's problems might come from listening to their customers

It's not the first time Microsoft has released a mediocre operating system (it will probably never match the terrible MS-DOS series), but Vista has been a PR nightmare for Redmond. Some analysts claim that Microsoft's recent bad quarter was partly due to Vista, with a lot of people refusing to upgrade. And to add insult to injury, Apple has been releasing commercial after commercial mocking the Vista fiasco.

Now, this might come as a surprise to you, but I do think one of the reasons for Microsoft's problems is that they have listened to their customers.

Excuse me? Since when is listening to your customer a bad thing?

Actually, more often than you might think. You might remember Digital CEO Ken Olsen's infamous 1977 quote: "there is no reason for any individual to have a computer in his home". That sounds pretty ridiculous now (all the more so as Digital was eventually acquired by Compaq, a PC manufacturer), but in 1977, if Digital had asked its customers what they wanted, they would have said faster computers with more memory and more disk space, not a small computer with 16KB of RAM that fits on a desk. It's only when they saw the possibility of personal computing that they decided it was what they wanted.

Yes, asking customers for their opinion is harder than it looks. Countless companies run customer focus groups, determine what their customers want, release the product they asked for... only to see it fail. The problem is that while customers are very good at knowing whether a product suits their needs, they are generally very bad at defining those needs. So listening is only part of the equation; the execution - what you do with customer feedback - is just as important a step.

For one thing, customer feedback is highly influenced by the customers' frame of reference. Consider the Mac's magnetic power plug, which prevents a laptop from flying off if someone trips on the power cord. Would such a feature be requested by any customer focus group? Unlikely. Sure, some customers might have hit the problem, but they would dismiss it as a general issue common to the way all electrical appliances work. In other words, you're not going to be innovative by listening to customers. As Henry Ford famously said, "if I had asked my customers what they wanted, they would have answered a faster horse".

The other problem is that customers' requests often contain a lot of implicit assumptions. If customers say they want, say, a faster computer, they often implicitly mean "without changing anything else" - in other words, you increase the computer's speed, but not its price, its noise or its power consumption. Note that this is not specific to computing or even to business. Of course in 2003 Americans wanted to free Iraq and get rid of Saddam Hussein. But their implicit assumption was that this would be done swiftly and with minimal American casualties.

Now let's get back to Microsoft. In the '80s and '90s, if they had asked their MS Office customers what they wanted (which I'm sure they did), what would they have heard? Probably "more features". One customer would have asked for feature X, another for feature Y, and so on... What does an engineer do in that case? He implements features X and Y, and adds feature Z for good measure. The more features the better, because then everybody is satisfied, right? And when Office started to be hit with a bad case of feature creep, what did people complain about? That Office was too difficult to use. Microsoft could have reduced the number of features, but each feature that got the axe would have alienated the few customers who asked for it. It could have come up with an "Office Lite", but that would have cannibalized sales of the full-fledged Office, one of Redmond's two main cash cows. So what they did was add an extra feature to explain all the previous features: the infamous paper clip assistant.

Microsoft's problem, in other words, is that they listened to their customers with poor follow-up execution - an execution shaped both by the engineering culture of "the more features the better" and by financial considerations.

Now consider Vista. With this release of Windows, Microsoft tried to address two long-standing problems: improving security and keeping up with Mac OS X's graphical interface. Two noble goals.

The problem was, as always, in the execution. In order to come up with Vista's Aero look, Microsoft had to bloat the Windows code. As a result Vista consumes *much* more resources than Windows XP. And that's how, by trying to improve one aspect of Windows (the look), Microsoft created one of the big criticisms of Vista: it's a resource hog. Of course customers would like Windows to look as sexy as Mac OS X - but not if it means slowing everything down.

Same thing with security. Microsoft came up with the infamous User Account Control, a feature that turned out to be so annoying most people turn it off - effectively disabling any security benefit it might have.

But as far as security goes, a more serious problem had been brewing for years, and the crap hit the fan with Vista. For years the Windows API has been growing tremendously. One of the reasons has been to keep Windows developers busy: you want to be up to date? You have to spend your time constantly learning Microsoft technology - no time left to learn anything else. But another reason is probably that Microsoft tried to please its developer community. You want an API to do this? Sure! You wish for another API to do that? No problem!

The problem is that not only does this drag performance down, it is also a security risk. Hacking a computer is often about combining separate features in a way the developers didn't think of; the more features, the greater the chance you'll find a combination that lets you hack the computer. For example, a lot of viruses connect to Outlook through Visual Basic (yes, Microsoft made that possible), walk through its address book and spam the victim's contacts - see the sketch below. To be fair, a lot of malware exists on Windows just because Windows is so ubiquitous; if the Mac had 90% of the market you would hear about many more security flaws, because that's the operating system hackers would focus on. But Microsoft didn't help matters by bloating the API of its operating system.
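
To give an idea of how wide open that surface is, here is roughly what those viruses do, sketched in Python rather than Visual Basic. It assumes Windows, Outlook and the pywin32 package, and is deliberately read-only - the worms simply added a send step on top of this:

    # Sketch: the Outlook object model is exposed to any script on the machine.
    # Deliberately read-only; mass mailers added a Send() call on top of this.
    import win32com.client

    outlook = win32com.client.Dispatch("Outlook.Application")
    mapi = outlook.GetNamespace("MAPI")
    contacts = mapi.GetDefaultFolder(10)   # 10 = olFolderContacts

    for entry in contacts.Items:
        print(entry.FullName)              # every contact in the address book

Early Outlook versions granted this access silently; only after worms like Melissa abused it did Microsoft bolt security prompts onto the object model. That's the feature-combination problem in a nutshell.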

So in order to improve security, Microsoft had to make some significant underlying changes - which meant actually *removing* functionality. Computer programs could easily talk to each other using shared memory? Not anymore! With Vista they need to use a more cumbersome API involving security descriptors - yes, Redmond made the process of writing Windows applications more complex by *removing* features. While this enhanced security, it is also the source of the other big criticism of Vista: a lot of software and peripherals stopped working.

So what could Microsoft do? Unfortunately their options are limited. They've dug themselves into a hole and there's no getting out without ruffling some feathers: all those features (visible or under the hood) have led Windows to consume more and more resources and have jeopardized security. With Windows 7, Microsoft seems to have the right attitude. They aren't supposed to add more bells and whistles to this new release but instead focus on performance and resource consumption. They're also revamping the infamous User Account Control. Time will tell...

Sunday, January 25, 2009

Apple without Steve

With Steve Jobs taking a leave of absence to treat an undisclosed health problem, everybody wonders about Jobs' actual health, whether Apple will need to continue without him sooner rather than later, and whether Apple can survive its CEO and co-founder.

Some claim that Apple has a lot of very talented people and will do fine, thank you very much. Others claim that Jobs is too unique a CEO to be replaced.

I personally think Apple cannot survive Steve Jobs in the long run, for several reasons.

For one, highly narcissistic leaders like Jobs aren't the greatest when it comes to succession. They cannot cope with the fact that someone else might do as good a job as them - let alone a better one. If Steve is notorious for being a control freak, it's because he doesn't trust anybody at Apple to do a good enough job without his enlightened supervision. As a result, narcissistic leaders often secretly wish (sometimes unconsciously) that the company will fail after them.

Second, take a look at Apple's history. Apple has done very well when Steve was CEO, and not so well otherwise. From its creation in 1976 to Jobs' departure in 1985, Apple did very well: it made a killing with the Apple II and revolutionized the computer industry with the Mac. Then, after a power struggle with John Sculley, Jobs was forced out of Apple. Not long after began a long, spiraling descent, with Apple releasing uninspiring products like the Apple Newton and less and less inspiring Macs. Even some professionals in the photo industry, one of Apple's strongholds, started to switch to Windows NT. Three CEOs succeeded Jobs and none of them was able to alter the trend; they lived off the Mac's legacy without introducing anything really new that would revive the brand. In 1997 even Steve Jobs, then asked to rejoin Apple as "interim CEO", didn't believe in the company he co-created (he immediately sold all of his stock but one share, even though it was at an all-time low). But Jobs did turn the company around. He made the Mac glamorous again, and Apple released two killer products: the iPod and the iPhone. Apple is now in a great position buzz-wise, operations-wise and cash-wise.

So something tells me that when the board appoints the next CEO, it will do as crappy a job as it did the previous times.

To be fair, Jobs wasn't the best CEO in 1985. He was seen as unruly and clearly lacked operational effectiveness (and why would he have had any? Apple quickly had so much cash on hand back then). One of John Sculley's first tasks was to streamline the company. Steve later learned the importance of sound operations when he started his own company, NeXT, and when he led Pixar, purchased from George Lucas - there he didn't have the luxury of Apple's treasure chest. As a result, Jobs was much more well-rounded when he came back to Apple, and he streamlined his company better than any of his predecessors.

But Jobs has always had several rare traits, even in the 80's. Besides having a gift for generating buzz (the famous Steve Jobs reality distortion field) and being a world-class negotiator, Steve has always been a visionary who has been able to make his visions come true. Under Jobs' guidance Apple released several killer products: the Apple II, the Macintosh, the iPod and the iPhone. His company NeXT created what is now the much-admired Mac OS X. And he turned Pixar from a small company into a box-office champion that came to Disney's rescue. People should consider themselves lucky if they lead the creation of *one* killer product. Steve led several.

Sure, it was Steve Wozniak who designed the Apple II and Xerox that invented the graphical interface. But it was Jobs who made those projects come true: recognizing the potential of the right concepts, wooing the right developers, insisting on aspects such as look and feel, etc. To borrow Wayne Gretzky's hockey line, Steve Jobs doesn't skate to where the puck is but to where it's going to be.

And that's what any Jobs successor will likely lack. Apple COO and acting CEO Tim Cook is operationally very strong, but he's unlikely to be as good a visionary as Steve. Let's face it: if Cook - or anybody else for that matter - could replace Steve, then Steve wouldn't have hired him in the first place. Unfortunately, Apple's business model needs a constant flow of visions. You cannot keep selling the same high-margin, cutting-edge type of product forever; there will come a time when a new iPod won't wow anybody because the concept has become so ubiquitous. That's why Apple came up with the iPhone, and the same fate will eventually strike the iPhone too. Sure, Microsoft has been making tons of money for decades selling Windows and Office, but those two products stopped making people dream a long time ago. Apple is in a position where if its products aren't sexy, they don't sell, period.

So my prediction is that when Jobs is replaced (and Apple will likely outlive Jobs even if he lives to be 100), it will be the beginning of the end for Apple. Not immediately, but eventually. Remember, Apple lived 12 years without Steve Jobs and was still alive when he came back on board, albeit in dire straits. If Jobs were to go now, Apple would be in a much better position than it was in 1985. But I highly doubt they'll keep coming up with killer products like they do now. And that might be what Steve secretly wants.