Welcome to the Post PC Era

Great post. I do agree that the new display drives greater usability and a more satisfying user experience. I find myself reaching for my new iPad more because it is such a pleasure to use. With its snappy response, great reading quality, and maximum networking flexibility (4G Verizon), I have a machine that meets 55 to 60 percent of my computing needs. Email and content viewing can be handled by the device. My computing experience is “blended” across three devices: laptop, iPad, and iPhone. It will be interesting to see how the competition responds. Current proponents of other platforms tout unique app management, “open” systems, and minimal app oversight. If that is the best differentiation they can offer, Apple will continue to rule the post-PC world.

You really should give credit for images that you “borrow” for your post. The macro shots are the ones from The Verge (http://www.theverge.com/), and you really should give credit (especially if you didn’t get permission to use them).

It’s quite smart how Apple has marketed their displays relative to the resolution of the human eye. By emphasizing the ‘Retina’ name, they claim the resolution title, and simultaneously nip any further competition in the bud by convincing consumers that more pixels wouldn’t be visible anyway, so they won’t bother paying more for them. (Of course, that assumes any other manufacturer can even match this display… someday, perhaps.) :)
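
To put rough numbers on that “resolution of the human eye” claim - this is just my own back-of-the-envelope arithmetic, assuming the usual 1-arcminute acuity figure for 20/20 vision:

```python
import math

# Pixels smaller than ~1 arcminute at your viewing distance should be
# individually invisible to a 20/20 eye. This computes the PPI needed
# to hit that threshold at a given distance.
def retina_ppi(viewing_distance_inches):
    pixel_size = viewing_distance_inches * math.tan(math.radians(1 / 60))
    return 1 / pixel_size  # pixels per inch at the acuity threshold

print(retina_ppi(15))  # ~229 PPI at tablet distance; the iPad's 264 clears it
print(retina_ppi(10))  # ~344 PPI needed at typical phone distance
```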

Jakob Nielsen wrote about 300 DPI displays back in 1997: http://www.useit.com/alertbox/9703b.html

This sort of reminds me of the debate over how high you should set the bit rate when ripping CDs. Audiophiles will insist they can tell the difference between 128 and 256 kbps, but the average person can’t. It’s great that the new iPad finally has the resolution it should have had from the beginning, but I don’t think it’s exactly a hallelujah moment. What can you actually do with a new iPad that you couldn’t do with an iPad 1 or 2? I would think the practical advantages are pretty much that you can read text a bit faster or squeeze a bit more content onto the screen without having to squint. Perhaps photos look a bit sharper. But all of these are incremental benefits, not a revelation.

The real revelation was the difference between the iPad 1 and everything that came before it…

As an old programmer, reading the comments here reminds me of the discussions about the first graphical user interfaces. What was a mouse good for? I could do the same work much faster on my 80x25 screen.

Indeed, I think the new iPad display is an incredible step forward for computer users. The device feels so natural that I almost forgot it is a computer.

But the post-PC device will still leave room for desktop PCs, mainly for content creation. Though to be honest, I spend most of my time on content consumption, not content creation.

@Takkun the image is already linked to the source.

@Axbm great reference. I loved this quote from http://www.useit.com/alertbox/9703b.html

The screen readability problem will be solved in the future, since screens with 300 dpi resolution have been invented and have been found to have as good readability as paper. High-resolution screens are currently too expensive (high-end monitors in commercial use have about 110 dpi), but will be available in a few years and common ten years from now.

Heh, not quite. That was written in 1997, so “ten years from now” would have been 2007. Only now, in 2012, are we getting the very first truly hi-res mass-produced screen. But it’s attached to this post-PC tablet, you see…

The Samsung Galaxy Tab 11.6 will have an 11.6" display with 2560 x 1600 WQXGA resolution – and be open. I’m waiting a few more months for that. Because while you’re right about the post-PC era, you’re not right about the closed world of Apple. Innovation doesn’t play well with boundaries.

Can you use it on the beach when the sun’s out?

So what we really need now is to connect those tablets to the PC to use them as the new human interface. Something like what they started doing with the iPad and Photoshop.

Resolution on most devices is a novelty, not a necessity. The idea of increasing resolution stems from the need to increase screen real estate so more information can be displayed. The current trend of increasing pixel density (which many manufacturers are following, not just Apple) simply makes the display “prettier” but no more useful (unless you look at a lot of photos). You can’t add any more info to an already tiny 10" screen because of physical limits. Who wants text that’s 1 mm high?
Secondly, as the iPad 3 has shown, it requires a lot more memory and more processing power just to drive the display. It needs a bigger CPU and a bigger battery without any true benefit to processing power.
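A quick back-of-the-envelope sketch of that display cost (my own numbers, assuming a 32-bit framebuffer refreshed at 60 Hz):

```python
# Framebuffer size in MB for a given resolution, at 4 bytes per pixel.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

print(framebuffer_mb(1024, 768))   # iPad 2:  3 MB per frame
print(framebuffer_mb(2048, 1536))  # iPad 3: 12 MB per frame, 4x the pixels
print(framebuffer_mb(2048, 1536) * 60)  # ~720 MB/s just to refresh at 60 Hz
```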
Lastly, like many fellow tech enthusiasts, I truly hope that the future is NOT Apple. I don’t believe it is, but what a nightmare that would be.
I still want to run real software, have real multiple displays that are usable from a distance, and have a CHOICE, etc. etc.
AND I want more processing power than a 10-year-old desktop system - a 1 GHz CPU with a whole 1 GB of RAM - and 4 graphics cores. Wow - my old P4 had more than that!
I want a 3 to 4 GHz CPU, 4 to 8 GB of RAM, hundreds of graphics processing cores, true multitasking, terabytes of storage, etc. etc. etc.

I remember when I first got HD TV - I was amazed at the difference in quality. I’d spend ages regaling everyone I knew with how great it was. However, in reality a lot of broadcast content is still supplied in SD format, and I soon realised that when a program was good I never really cared or thought about “the pixels” because I was engrossed in it. (Probably my favourite TV of the year was an SD broadcast of a Danish series, “The Killing” - with subtitles.)

So, yeah, a great display is nice - but it is content that defines the experience. A funny video of a cat riding a tortoise on YouTube won’t be funnier because it’s in high definition (and, in fact, won’t be in high definition on the new iPad, as a lot of content will have to be upscaled). Certain things like readability will definitely be improved, but backlit screens will always be more of a strain on the eyes than low-tech paper, and won’t work great in bright light.
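
To put rough numbers on the upscaling point (my own arithmetic, comparing common video widths against the new iPad’s 2048x1536 panel):

```python
# Common video widths vs the new iPad's 2048-pixel-wide panel;
# anything narrower than the panel has to be stretched to fill it.
PANEL_WIDTH = 2048
for name, width in [("480p SD", 854), ("720p HD", 1280), ("1080p HD", 1920)]:
    print(f"{name}: upscaled {PANEL_WIDTH / width:.2f}x to fill the width")
```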

Personally, I’d like to see Apple innovate a bit more in terms of the UI. Does a home screen full of icons become a better user experience because those icons are a little bit sharper? Or is the problem that a screen full of static icons is actually awful UI (equivalent to a Windows 98 desktop of an elderly relative packed full of shortcuts)?

I don’t see a move from the PC to a “Post PC Era”. To me it looks as if the computing world is constantly trying to figure out what the ratio between computers and users should be.

Every ten years or so somebody notices that it is inefficient or somehow not good enough if [only one user uses a given computer|a given computer is used by more than one user] and advocates that instead [many users should use one computer|every user should get his own computer].

In the 60s and 70s we called it mainframe, now we call it cloud. The clients are more sophisticated (the iPads and phones and the like, not the users) and the cloud consists of many networked computers rather than one big one, but the effect is the same: many users use one computer (system) again. (In the 90s the expected shift from many computers to one computer system failed because of a mismatch between user expectations and available connectivity technology. But we did get the Web.)

I think that perhaps the entire history of computers can be explained as the constant struggle to make one computer system support more than one user whenever every user had his own computer and to give every user his own computer whenever one computer system was used by many users. This resulted in more computers and more users because we always added but never subtracted.

“Jakob Nielsen wrote about 300 DPI displays back in 1997: http://www.useit.com/alertbox/9703b.html”

This is off-topic, but just too interesting. From the article:

“Use hypertext to split up long information into multiple pages”

That is bad. That is really really bad. He got it so terribly wrong even though he wrote the article at a time when PCs were as badly connected to the net as mobile devices are now.

I absolutely hate loading a Web page for a minute or two (depending on location), only to find after a minute of reading that I have to load another page (and another, and another). The ridiculous habit of pretending that a Web page is like a page in a book or magazine is, imho, one of the worst features of the Web today.

Good thing Jeff’s blog is not like that.

Interesting article, although I’m still not quite sold on the tablet as a form of computing.

The trouble with high resolution PC displays is that Windows 7 and earlier don’t scale up well. You can increase the font size, but that doesn’t make everything scale up, so some things you need to click on are very small. Also, not everyone tests their applications with different font sizes, so you get strange effects with bigger fonts.
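
A rough illustration of why unscaled elements shrink - my own arithmetic, not anything Windows computes for you:

```python
# A 16 px icon designed for a 96 DPI screen, drawn unscaled on denser
# displays: its physical size shrinks as the pixel density goes up.
ICON_PX = 16
for dpi in (96, 120, 144, 264):  # 264 is roughly the new iPad's density
    width_mm = ICON_PX / dpi * 25.4  # 25.4 mm per inch
    print(f"{dpi:>3} DPI: a {ICON_PX} px icon is {width_mm:.1f} mm wide")
```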

Great post! I had read your post about high resolution monitors around 2007. Back then, 24 inch screens had a resolution of 1920x1200; fast forward 5 years (bloody 5 years!) and we still have 24 inch monitors with exactly the same resolution. Worse still, with these HD marketing gimmicks the resolution has actually come down to 1920x1080 on other displays! Can you fucking believe that? Laptops come with that eyesore of a 1366x768 display. I have avoided buying a laptop for the last 5 years just because of this. When the rumours were going around about the new iPad having a retina display, I thought it was wishful thinking more than anything else, but they actually did it. I truly hope this innovation will be replicated across the board as far as displays are concerned. I still can’t get my head around the fact that the 10 inch iPad has more pixels than my 24 inch monitor. This has to change.

It’s an interesting article, but I’m not convinced.

Ignoring the current leveraging going on around the environments that these new devices exist in (because that’s arguably a separate, though important, discussion - e.g. I’d never buy anything from Apple), tablets & phones only augment the tech landscape. They don’t replace it. We’re still going to have servers and all that that entails. We’re still going to have legions of workstations for content creation.

We haven’t even begun to see the effects of long-term PC use, let alone heavy phone & tablet usage. How long before we see the next set of health effects from working too long hunched over a cramped, too-small tablet, trying to type long documents, for example? ‘Oh, but I use my iPad for everything; I bought it instead of a PC.’ Maybe they’ll call it iPad Shoulder, to match Nintendo Thumb.

It’s lovely that the iPad has that resolution - but is there any user value in it? Communication is still more important than watching a high-resolution YouTube video of a cat. Hasn’t the entire crux of the article been that the tablet & phone aren’t aimed at general-purpose computing, but at email, browsing, and what ‘the general public’ want? Beyond occasional hits like Angry Birds, the gaming market is still firmly on consoles & the PC, and for some good reasons. So I’m still not sure I see what the benefit is beyond the wow-cool-current-gadget factor. I applaud the innovation, and maybe it will drive something by itself, but I don’t think the average consumer (which in fairness doesn’t really describe Apple’s target market anyway) would be that concerned.

If I were a betting man, I’d still look to the mobile phone market first, simply because more people care about making phone calls than about having a very expensive paperweight with a bigger display, and more people will be able to afford one GSM SIM than two (so they’re going to have a phone first either way). Internet access just isn’t as ubiquitous as people like to claim, either.

I don’t think these devices are ‘there’ yet. When cheap tablets are being given away in third world countries to help jumpstart their access to the Internet (which happens with PCs), then we’re in the post-PC era. I’m not even going to touch on things like the reuse/recycle value of a general-purpose PC versus the number of locked mobiles & tablets that get thrown away each year because the new big thing is out.

Thank you for pointing out this truth, which should be obvious but obviously isn’t. Screen DPI is the one thing that has not improved in the last ten years, despite making such a big difference to usability.

As for “but is there any user value to it?”:

Yes. We can finally do away with all the annoying anti-aliasing algorithms, with all the blurry TrueType fonts and ugly pixel-perfect fonts; we can skip calibrating the monitor and graphics card to get proper sub-pixel AA, and most of all, skip making all the different software do this right.

I spend all day staring at a relatively blurry, big but low-res screen (~26" at 1920x1080), and I don’t think it’s doing my eyes a favour. This is the biggest jump in user-friendliness we’ve seen in a decade. XP was very usable already, and Windows 7 is a lot more polished, but it’s essentially the same thing. I’m still using a mouse, I’m still using a (mechanical) keyboard based on an ancient design by IBM. I can’t wait to get a 300 DPI screen, or two.
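
For comparison, a quick pixel-density calculation (my own arithmetic) for that monitor versus the new iPad:

```python
import math

# Pixel density from resolution and diagonal size.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(ppi(1920, 1080, 26.0))  # ~85 PPI: the blurry desktop monitor
print(ppi(2048, 1536, 9.7))   # ~264 PPI: the new iPad
```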

I’m genuinely amazed there’s not a single comment here about how it isn’t possible/practical to write code on a “post-PC” device.

Bar a few specialised industries, how many people can really do all their work on an iPad?