I have owned a personal computer for over half of my life (now, that is a depressing thought). My first was a Sinclair ZX Spectrum, which I bought in 1985, when I was an undergraduate student in Genova. After moving to the US, I bought my first IBM PC compatible in 1988, and for fourteen years that was to me the paradigm of home personal computing. Some (actually, very few) of my friends bought Apple computers, and I kept wondering what the hell they were thinking.
Besides being ridiculously overpriced (easily a factor of two, often as much as three times more expensive than a comparably equipped PC), those machines looked to me more like nice gadgetry, almost toys, than tools with which one could actually get some work done. Most of the things that I did at work could be done easily on a PC, in almost the same way. The operating system, MS-DOS, was clumsy but somehow reminiscent of UNIX, the system on which I had learned computing and which I have been using for my work since I started doing research.
Sure, there was also a graphical user interface (GUI) named Windows, but to say that it was bad would be a compliment. Plus, I never saw the point of dragging funny little monsters all over my screen; that was what Apple was preaching, and some people seemed to like it… not me. I liked the command line interface (CLI) of UNIX; to me, that is what computing was all about.
In 1993, following the advice of a friend, I finally got rid of MS-DOS and installed on my home PC a free version of UNIX, the legendary Linux. That would be personal computing for me, at home and at work, for the following nine years.
If I wanted to move a file, or create or rename a directory (that is what “folders” used to be called), I could do it by typing a command. I could write scripts to perform complicated tasks, such as merging and archiving files every week, or even reminding me of my deadlines. I could write the code that I used to perform calculations, papers to submit for publication, class notes, presentations, letters… I could also surf the web, manage e-mail and all the rest (including games), in a way that was much more stable than on other platforms, and virtually immune from the viruses that plague MS-Windows and make the life of its users a little hell.
There was virtually nothing I could not do. To me, nothing could surpass the power and efficiency of that computing solution — surely not suitable for everyone, but just perfect for someone like me.
Seven years ago, however, I switched to Apple, and have been an Apple user ever since. What was the reason for making such a strange decision, given all that I wrote above? I would be a liar if I wrote that the incredible coolness of Apple’s products did not play any role. My finances were in significantly better shape than fifteen years earlier, and I could afford to spend a few extra bucks for the “coolness” factor. However, that was not what triggered the whole process.
Interoperability with the world of Microsoft software is what drove the move, in the end. Secretaries, colleagues, students, administrators, granting agencies, and editors routinely sent me documents typeset with some Microsoft application, expecting me to be able to open, read, and modify them. I often could not. Free applications written for Linux, supposedly capable of performing such tasks, simply did not deliver. And the approach advocated by Richard Stallman (simply refuse to read the document and demand that it be sent in an open format), while certainly making a point, did not do much for peaceful coexistence and harmonious relations.
As I was reluctantly considering a switch to the hideous MS-Windows, friends of mine suggested that I take a look at Apple’s new operating system, named OS X. I went to the university computer store and played for a few minutes with one of the newer Apple models sporting that operating system. It was immediately clear to me that that was the way to go. In a move that many of us still regard as surprising, Apple had decided to go the UNIX way itself. In many respects it was like being on Linux, with access to all the open source software on which I had come to depend (e.g., gnuplot), but with two added bonuses: a GUI much better integrated with the various applications, and more functional, than the one provided by the various Linux distributions, and full interoperability with MS software, since all the major applications (e.g., MS Word) exist for Mac OS X as well.
After seven years, I am quite happy with that choice, and do not wish to go back to PC and Linux, in spite of the premium price that I have to pay for Apple products (yes, it is still the more expensive computing solution). And I can see that I am not alone. Apple products have become hugely popular among scientists. It suffices to go to a meeting of the American Physical Society and simply count the number of Apple versus non-Apple laptops used by attendees. I think the former are now likely the majority, and even if not, their share has increased dramatically since OS X was introduced (very few Apple laptops could be spotted a decade ago).
And of course, academic scientists are very effective at publicizing this type of product to the general public. Students, for example, who constitute a huge market, certainly take notice of what professors use; they ask them what it is that they like about that particular gadget, and it is my impression that they often end up acquiring the same product for themselves.
What have you done for me lately?
Which brings me to the following question: if a particular company makes products that might enjoy some popularity within a specific segment of the market, should it not actively promote their use by making sure that they are used by key, influential individuals, even at the cost of giving those individuals the products for free, or at a much reduced price?
After all, none of the top-1000 tennis players worldwide has to shell out a penny to purchase a tennis racket or tennis wear; they get this stuff for free, as the companies that make it expect to benefit from the publicity ensuing from GreatTennisPlayer wearing XYZ shoes.
Why do professional scientists, or academics, not enjoy a similar benefit? Really, how many amateur tennis players (a relatively tiny fraction of the population anyway) go and buy a racket based on which player uses it? I would think that many more people purchase computers based on the input (direct or indirect) of someone like a university professor. It seems to me that, in terms of pure marketing strategy, our category should be at least as relevant as tennis players.
So, why is it that we only get a dismal discount on an Apple laptop, and have to spend extra money for a non-ridiculous warranty? Why isn’t Apple sponsoring physics conferences, offering scholarships to students, and so on? Why isn’t Apple giving all, say, physics professors, oh I don’t know, a free iPhone (just a random example; I do not know why that was the first thing that came to my mind)?
What has Apple done for its customers lately? OK, they make computers that work great, but… as a professional community, could it be appropriate for us to start playing hardball, and make the point that we should not be taken for granted? Not that I would know how to do it, but how do tennis players (who make quite a bit more than us) get away with it? After all, scientists are much smarter (and some of us way better looking)…
 I shall not put a link to that, er, thing on my blog, sorry.
 I still use Linux for cluster computing, and for that specific purpose I cannot imagine moving away from it in the near future.
 Meanwhile, the price difference between Apple and PC compatibles, while still noticeable, had shrunk significantly.
 OK, I admit it: with time I have mellowed, and slowly moved away from my early CLI-only hard-line stance, warming up to the GUI concept…
 I am talking, of course, about those rare moments when we take a break from the talks and check our work-related e-mail for only a few minutes, taking advantage of the wireless internet connection provided. I certainly do not mean to suggest that serious scientists would go to a conference and spend most or all of their time surfing the web… I mean, come on, that would be… so bad… no, no, we do not do that at all. Oh no… no, no, no. Hey, I said no.
 I cannot support this statement with any data other than my personal observation: my three most recent graduate students came to work with me as MS-Windows users. I bought Apple desktops for them to use at work, in order to make sharing files and software easier (since I use Apple). They all ended up liking that environment so much that each of them bought an Apple laptop for themselves.