This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2012/03/welcome-to-the-post-pc-era.html
What was Microsoft's original mission?
Is there a magic DPI that qualifies a device for the “post-PC” era? Would 1920×1080 on a 10″ device be “worse” than an iPad 3?
I think at a certain point, those are just numbers that the general public wouldn’t care about or even notice unless directly pointed out to them. I would simply propose that the post-pc era began with the iPad 1 and that the iPad 3’s higher resolution simply improved upon it.
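For what it’s worth, the comparison in the question above is easy to run: pixel density is just the diagonal pixel count divided by the diagonal size. A quick sketch (the 10″ 1920×1080 tablet is hypothetical; the iPad 3 figures, 2048×1536 at 9.7″, are its published specs):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical 10" 1920x1080 tablet vs. the iPad 3 (2048x1536 at 9.7"):
print(round(ppi(1920, 1080, 10.0), 1))  # ~220.3 PPI
print(round(ppi(2048, 1536, 9.7), 1))   # ~263.9 PPI
```

So by raw density, yes, the hypothetical device would sit noticeably below the iPad 3, though as noted above, the general public probably wouldn’t notice either way.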
I hate being put in a position to defend Microsoft, but history demands it.
Have you forgotten the viral Project Origami video from back in the day? Just look at it:
What MS promised, it couldn’t fulfill. But Apple has, and has even gone beyond it with the iPad’s Retina display.
Probably 250 pixels per inch or higher density for your average screen at reading distance (about 30cm).
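As a sanity check on that 250 figure: a common rule of thumb is that normal visual acuity resolves about one arcminute, so the “can’t see the pixels” threshold depends on viewing distance. A quick sketch of that rule (the one-arcminute limit is an assumption, not anything from the post above):

```python
import math

ARCMIN_RAD = math.radians(1 / 60)  # one arcminute, a common visual-acuity limit
MM_PER_INCH = 25.4

def retina_ppi(distance_mm: float) -> float:
    """PPI at which one pixel subtends one arcminute at the given distance."""
    pixel_mm = distance_mm * math.tan(ARCMIN_RAD)  # smallest resolvable feature
    return MM_PER_INCH / pixel_mm

print(round(retina_ppi(300)))  # ~291 PPI at 30 cm
```

By that rule the threshold at 30 cm comes out closer to 290 PPI, so 250 is in the right ballpark, if slightly generous.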
I think they augment more than supplant, but we’ll see. Maybe once they add a reliable voice-to-text feature to overcome their major shortcoming: typing anything of length.
The iPad 4, yeah, the next one, might be my first Apple computer purchase ever.
I’m not sure that very high resolution displays are, in fact, any part of “computing”, let alone a “deep fundamental improvement” in “computing”. Perhaps you meant that high-res displays are a “deep fundamental improvement in computer displays”? As you probably realize, the display only shows you the output of the “computing”. It actually has nothing to do with the “computing” part. If you were referring to the user experience, then there are words that reflect that milieu, as well. I don’t mean to pick nits, but the confusion that equates a display’s resolution with “computing” makes it difficult to take your more technical arguments at face value. It’s like saying a better modem improves “computing”.
While I’m quite pleased that Apple is pushing tablet hardware forward, I’m firmly in the camp that apathetically says they “just” improved the display. The higher-resolution display offers a more pleasant experience, but the same experience as with the iPads 1 and 2.
Anyone who didn’t have a use for the iPad 2 won’t develop new use cases due to a higher-resolution display that displays the same amount of content at once, so I don’t place much significance on the development.
So now we get busy figuring out how to make tablets into our developer platform for the early 21st century. I love a good challenge.
“existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing”
Can you share any references for this? Sounds like interesting reading.
Presumably you had to buy two because in this amazing post PC era operating systems don’t support multi-user accounts?
Dan Booth, I don’t “share” my laptop with my wife. She has her own laptop. If she uses mine she uses Guest or if I’m already logged in it’s not that big of a deal for her to just look something up. Same goes vice versa. The ability to have multiple accounts doesn’t mean you have to share the device.
Now, in Jeff’s case you may be right. I won’t speak for him specifically…
I agree with Jeff. The experience of computing on an iPad wasn’t complete until this revision. Personally, I hated my iPad 1 and didn’t see a reason to get an iPad 2, as there was no change in screen resolution. Like Jeff, I now have two of the new iPads.
I heard today that the Xbox outsells all other kit and is now ranked as the most common device in the home; presumably this is part of the “existential crisis” you mention…
I lived and worked through the early laser-printer era. 200dpi was a technological marvel but not actually useful for real work. 300dpi was a game-changer: you knew you were looking at computer printout, but you could actually work with it. TeX came into its own, math formulas and all. When 600dpi came in, you could work with the printouts without really noticing that they were computer printouts.
I also had a chance one summer to work with an APS-5 phototypesetter having, I believe, 7200dpi of resolution. The letterforms were absolutely gorgeous, but it was clearly ridiculously high end and far beyond what was needed for usability.
I have seen the new Retina display, and I welcome the new standard. But I will be even happier when we get another factor of 4 up to 500dpi. Pixels rule!
In 2007 I asked where all the high resolution displays were. Turns out, they're only on phones and tablets.
Oh, absolutely. I can’t find a high-resolution computer monitor - anything above 1920 × 1200, at any size/DPI - for less than the price of a whole iPad 3. This is quite insane.
Generally, the closer we get to the printed page, the better we can communicate information. And 72 to 100 PPI is awfully far from what I’d even call “close”.
“all the existing HCI research tells us that higher resolution displays are a deep fundamental improvement in computing”
Mind providing some citations on that?
@A Facebook User
Go back to Facebook, kid. And take your nits with you.
And for a humorous take on what happens when a corporation achieves its goal:
I look at those Gmail icons and say “Whoopdeedoo!”. Do I really care that the space on which I pressed my index finger is packed with more pixels? No. I get my email either way. My iPad 1 is just as useful a book reader or Netflix viewer. (Sure the camera is nice but…)
It’s like when you’re reclining in a theater seat watching the latest blockbuster. The great new CGI effects will wow you up to a point, but there still has to be a story. Those super crisp pages of the ebook I’m reading still have to convey some useful content.
What made tablets great, even revolutionary, wasn’t the awesome display. It was a new form of human interface. Fingertips replaced the mouse and the keyboard… well, to be honest I couldn’t have created this post without a keyboard. I agree with Castaa. Once they perfect voice to text, we’ll really be rocking.