Why the New 5K Retina iMac is a Game Changer

On the second day Steve Jobs was back at Apple, I had a chance to sit down with him and ask how he planned to rescue the company. At the time, Apple was $1 billion in the red, and we now know it was only a month or two away from even greater financial disaster. The company had been mismanaged; multiple CEOs had tried to make Macs more like traditional PCs to stay competitive, a strategy that almost buried Apple.

So when Jobs came back to Apple, he faced a company whose morale was low and whose prospects were even lower. When I asked him what he was going to do, the first thing he told me was that he wanted to go back and take care of the needs of Apple's core customers. He defined these customers as the graphics, advertising and engineering professionals who used Apple's products for all types of photo editing, imaging and video projects. He felt the past CEOs had abandoned them while trying to make Apple more relevant to a broader audience.

He was aware the Mac had led the desktop publishing revolution and was deemed the best personal computer for a narrow but, if served right, lucrative audience of graphics and engineering professionals. So, as he told me then, the first order of business was to create more powerful Macs, go back to these professional audiences, and give them tools that would make their projects easier and their businesses more profitable.

With the new iMac with 5K Retina Display, Apple is fulfilling the vision Jobs laid out back in 1997 and delivering to these pro users what appears to be the best personal computer ever designed for creative professionals. We have had 4K displays on the market for some time, and while they have come down in price, they are still just stand-alone monitors. What Apple is bringing to market is a 5K Retina display built into a full Mac starting at $2,499, a price even small and medium-sized business (SMB) graphics professionals can afford. More importantly, by moving the display to 5K, Apple is upping the ante for its competitors, and the resolution will probably become the de facto standard for graphics-oriented desktop PCs. It will be a game changer.

I see this price point as interesting for a number of reasons. Any graphics or engineering professional would probably buy an upgraded model with a faster processor and graphics co-processor, pushing the price closer to $3,000 or more. But at $2,499, the base model is friendly both to SMBs and to consumers looking for an all-in-one desktop to serve as a multipurpose home computer the whole family shares. In that role, it could even double as a TV, running the 4K shows already streaming on Netflix and the streaming services coming from HBO, CBS and other major networks in the future.

Although laptops dominate the market for personal computers, desktop demand has also been steady, as all-in-ones have garnered real interest in many homes because of their utilitarian nature and shared capabilities. Many are set up on kitchen counters or on desks in the den or family room, where various members of a family can use them to check email, search the Web, do light productivity tasks and even watch videos from YouTube and other streaming services. The option of a 5K all-in-one that could also handle streaming 4K programs is quite interesting and could make it a hit with upscale consumers as a PC and 4K streaming TV rolled into one.

One of the more interesting questions a couple of media folks asked me at the Apple event was whether I thought this 5K Retina iMac could be the precursor of an Apple-branded TV. There has been a lot of speculation about whether Apple's new TV strategies would include an actual TV set or just a souped-up Apple TV with a new UI and new ways to interact with TV content. My son Ben and I go round and round on this subject, and personally I don't think an Apple-branded TV makes sense given the cutthroat competition in the TV market (neither does he).

However, Apple did something very interesting with the design of the 5K Retina iMac. It created its own timing controller, or TCON, a chip designed just to manage and manipulate each pixel with a level of precision we have not seen so far in TV or PC displays. I have no idea whether this could be put into a dedicated Apple TV, but at a technical level this processor could give Apple a decided edge in TV designs should it want to go in that direction.

I see the iMac with 5K Retina Display as an important product for Apple and its professional audiences, and perhaps a hit with upscale consumers who want a high-res monitor that can serve multiple purposes, including watching 4K streamed TV content. But I suspect the technology designed into it will find its way into other products, and that makes this new iMac a product to watch closely over time.

In Praise of Old-fashioned PCs

Photo of IBM PC (Wikipedia)

I'm a big fan of tablets, especially the iPad. Although I find myself spending more and more time with a tablet and less and less with a traditional computer, I can't imagine getting by without a Windows PC or a Mac. And that is why, though the market for traditional computers will shrink, they aren't going away.

The tablet is the only computer a lot of people will ever need. If the iPad or an Android tablet isn't quite up to the job, the new, more PC-like Microsoft Surface might well be (see Patrick Moorhead's post on the Surface's advantages). But "a lot of people" falls well short of all people.

When he introduced the iPad in 2010, Steve Jobs famously observed that the PC was like a truck and the iPad was a car, and most people don't need trucks. He was right, but he seriously underestimated the importance of trucks. Nearly half of all vehicles sold in the United States are light trucks. Even if you eliminate the more car-like crossover SUVs (maybe those are the Surfaces), trucks still account for about a quarter of the sales.

I'm writing this on a Windows desktop PC (for a change; I usually work at my iMac when I'm at home) because I can. I've done WordPress posts entirely on the iPad (with a Zagg keyboard), and while it is quite possible, it isn't much fun. I regularly work with multiple windows open and often cut and paste material from one app to another. You cannot easily do that on a tablet.

There are three activities that keep me on the traditional PC. First, I do a lot of technical writing and editing, which generally involves large (100-plus-page), highly formatted Word documents. There is no alternative to Word, and often Excel and PowerPoint for collateral material. The tech pundits who keep predicting the imminent demise of PCs and heavyweight Microsoft Office applications underestimate how deeply these tools are ingrained in enterprise workflows.

I also do some video editing. Not a large amount, but enough to know that I want the fastest, most capable system I can lay my hands on. Even simple editing taxes a system, and transcoding and rendering video can get really time consuming. The process of capturing an hour of live video and editing it down to a five- or ten-minute cut can also generate many gigabytes of files.

Finally, there’s photo editing. I love the hands-on aspect of photo editing on a tablet and iPhoto for iOS is a fun tool. But for serious work, whether it is preparing graphics for Tech.pinions posts or processing my own photos, I turn to Photoshop.

While we are talking about threes, there are three things that PCs have and tablets lack. First is processing power. Today's tablets have plenty of power for the tasks they are intended to do, including rendering HD video. But to achieve 10 hours of battery life in a very thin, light tablet, things have to go, and one of those things is raw computational power. There's no way an ARM chip, or even an Intel Clover Trail Atom, is going to match the performance of an Intel Core i5 or i7 with Intel's latest integrated graphics, let alone with a discrete graphics system.

Second is a big display. Some tasks, especially those involving multiple windows, want all the display real estate you can throw at them. I generally work with 27-in. displays and am thinking of going to dual monitors if I can figure out how to make them fit. A tablet limits you to one smallish window (one and a half, sort of, on the Surface).

Finally there’s storage. I haven’t taken an inventory lately of how much storage I have connected to my local area network, but it’s more than five terabytes, with a terabyte of local storage on my two main desktops. A tablet offers 64 GB, max. Yes, there is all but unlimited storage in the cloud, and I keep a lot of stuff in the cloud. But I want local copies of my important content, and that includes lots of music and photos, as well as thousands of documents.

For all these reasons, my PCs aren’t going to disappear. And neither, I suspect, are an awful lot of others. (On the other hand, I do find that I am using my Mac and Windows laptops less and less, as tablets take over the mobile chores.) Many business users are going to continue to need full-bore PCs as well, although there too we may see fewer laptops and a return to desktops.

At the recent Apple product announcement, the thing I found myself lusting for was not the featured iPad mini but the new 27-in. iMac with a Fusion drive. I love the super-portability of the tablet, but I still need the heavy iron too.

Apple as Innovator: Four* Contributions That Changed Computing

Reading the comment threads on Tech.pinions' many posts on Apple vs. Samsung and iOS vs. Android, I have been struck by the recurring charge that Apple is nothing but a clever marketer that copies (impolite version: steals) and repackages the work of others. To anyone knowledgeable about the history of the industry, this is pure nonsense, though I'm not sure evidence will do much to persuade the doubters. Nonetheless, here are four critical Apple innovations that reshaped the tech industry:

Desktop Publishing. The laser printer was invented by Xerox in the late 1960s and developed in the 1970s by Canon, Ricoh, and Hewlett-Packard. But in the mid-80s, nearly all "letter quality" printers relied on typewriter technology. Apple had the vision to combine the capabilities of the laser printer, the new Macintosh, and Adobe's PostScript page-description language to put something resembling professional page composition on the desktop. Apple's LaserWriter printers were not terribly successful, and the company decided to leave printing to HP and others after a few years. But Apple's early commitment to the technology set the stage for the desktop publishing revolution that not only gave every computer user the tools of the graphic artist but revolutionized commercial publishing.

The legacy-free computer. In 1998, every computer was expected to have a floppy drive. Windows PCs came with PS/2 ports to connect a keyboard and mouse, a parallel port for a printer, and a serial port for other chores, such as syncing a Palm Pilot. Macs instead used the proprietary Apple Desktop Bus and LocalTalk ports. That spring, Steve Jobs, who had just resumed the helm of Apple, introduced the original iMac. In addition to looking completely different from any computer anyone had ever seen, the iMac dispensed with both the floppy and all legacy ports, replacing them with Universal Serial Bus connectors. USB had been around for a while and was standard equipment on all Intel motherboards, but since Windows didn't reliably support USB until Windows 98 Second Edition in mid-1999, the ports were barely used. Apple, which was still in very shaky financial condition, got scathing criticism for its leap into the future. But while floppies and legacy ports persisted on Windows machines for years, the iMac was a runaway success, and suddenly all those indispensable legacies became dispensable indeed. (The iMac's USB "hockey puck" mouse was a less brilliant idea and was soon replaced by a more conventional design.)

Wi-Fi. No, Apple didn't invent Wi-Fi or, as it was originally called, wireless Ethernet. That honor goes to AT&T (later Lucent) Bell Labs. But Apple began putting AirPort cards (actually rebranded Lucent Orinoco PCMCIA cards) into Macs, including some desktops, in 1999, before the IEEE had even completed the agonizingly slow process of ratifying the 802.11b standard they were based on. (Here's a 1999 article I wrote on Apple's offerings.) Slower versions of the 802.11 standard had been around for a while for commercial and industrial use, but Apple took the blindingly fast version (a theoretical 11 megabits per second, up from 2 Mbps) and turned it into a consumer product. Although others, particularly Intel, later played an important role in making Wi-Fi ubiquitous, it was Apple that had the vision that freed our computers from their network tethers.

WebKit. It's easy to forget how awful mobile browsing was before the iPhone. Not only did most devices have minuscule displays, but the browsers on Palm, Symbian, BlackBerry, and Windows Mobile devices were just terrible. The WebKit browser engine that was the basis of the iPhone version of Safari totally changed the game by bringing desktop-class browsing to a handheld. Even though the original iPhone, which lacked 3G support, suffered from slow connections, it provided a vastly better browsing experience than anything we had seen before. Even better, it's open source (not entirely by choice; WebKit was based on the KDE project's open source KHTML), so it is widely used in other companies' browsers, including Google Chrome.

Purists can complain that Apple didn't invent any of these. But that's the difference between invention and innovation. And while the cleverness and insights of the inventor are essential, we need the daring and vision of the innovator to move forward. The iMac in particular was an extremely gutsy move; Steve Jobs bet the company on a novel design, and its failure would almost certainly have meant the end of Apple.

Even during its darkest days in the mid-1990s, Apple remained a remarkably inventive company. For example, the Newton MessagePad was a failure, but no one can say it did not break significant new ground. There are many things you can fairly criticize Apple for, but the charge that the company fails to innovate is just plain silly.

* There are three kinds of mathematicians: those who can count and those who can't. The original headline said "three." While writing the piece, I added the section on the LaserWriter but forgot to change the headline.