Post PC – Should We Care?

The term “post PC era” is becoming more and more popular in online discussion. Should we care?

Does anybody else cringe at the term “post PC era” like I do? Wait, don’t go, I’m not going to rant about how the PC isn’t dead and won’t die and blah blah consoles. Rather, I’m asking the question of whether we should care. PC sales appear to be declining, which some analysts take as a sign that we’re moving away from desktop PCs, or even laptops, and into a world of mobile devices like tablets. As these smaller devices become more and more capable, some argue that the traditional desktop PC will die out from general use. A lot of people are trying to suggest that this is a bad thing. Is it really? Actually, is there even a post-PC era on the way or not? Here are my thoughts, because it’s my goddamn site.

PC Sales – What changed?

The latest figures suggest that PC sales have declined. Now, statistics can be twisted and read any which way you like, and one possible conclusion is that people are shunning PCs in favour of tablets like iPads or the millions of Android tablets. You see that monstrous shit that Asus comes out with? God damn, tablets as big as your goddamn monitor, how is that practical? Oh, never mind, different topic. I’d suggest a different possibility: we don’t need to upgrade. Back in the 90s and early 2000s, upgrades came thick and fast, and they were practically essential. You always needed more RAM, a faster CPU, more HDD space, a better sound card, and for gaming, an increasingly powerful GPU. If you wanted to keep up with the cutting edge, you were stuck on a 6-to-12-month upgrade treadmill with no end in sight. There was never a good time to upgrade, because whatever you bought was already obsolete. Weird Al said it best in his song It’s All About the Pentiums – “My new computer’s got the clocks, it rocks, but it was obsolete before I opened the box!” It sounds absurd, but god damn, it was true. And tech wasn’t cheap back then either, so if you wanted to play, you had to keep up.

In gaming the situation was even worse. Gaming tech exploded with the popularity of home personal computers, and the leaps came obscenely fast. In 1992, we had a raycast single-layer maze with Nazis. By 1993, barely a year later, we had sector-based engines with different floor and ceiling heights and lighting and stuff! In 1996, just three years later, we had full 3D environments with Quake. From there things just got faster and faster. Back in the early 90s you didn’t need a fancy GPU. A video card just needed to support the various video modes and that was it. By the late 90s software rendering was dying, and Direct3D and OpenGL were big deals. So was Glide, the hardware acceleration library from 3Dfx, who made the Voodoo series of GPUs. If you had a Voodoo card in the 90s, you were hot shit. Well, at least until NVIDIA came along and stomped all over 3Dfx… though the later Voodoo cards certainly didn’t do much to help them. The early 2000s were a nightmare for upgrade cycles too, because just about every goddamn year a new shader model was released, and games started requiring it.

Now we come to 2013, and the upgrade cycle is dead. It hasn’t slowed down, it hasn’t stayed the same, it’s dead. Even in gaming. The last massive change was when we stopped using single-core CPUs and moved to multiple cores. Since then, RAM has become incredibly cheap to the point where most systems have it in abundance, many applications and games still aren’t making good use of multi-core systems, and the most significant change to storage has been the introduction of affordable SSDs, one of the few areas where upgrades still make sense. Otherwise, a computer from two or more years ago is still just as viable as newer hardware for most applications. Hell, if you’re just using general office applications, you could go back five years to 2008. An i7 920 bought in 2008 is still a powerful CPU (and it cost a fortune in 2008, I know because I had one). So what changed? Why did the upgrade cycle die? Why is it that even with DirectX 11 available, DirectX 9 is still the minimum standard supported by practically all games?

For gaming, we can probably thank consoles. This console generation has been awfully long and proved very popular, which has slowed the adoption of new tech on the PC. When the next generation comes out (probably this year) you can expect the upgrade treadmill to pick up a bit of speed. But if the same story repeats, then we’ll fall into a comfortable system profile and stay there with only minor upgrades. For basic home use, the fact is that there hasn’t been much movement in the home sector for a while now. Video editing and processing for mobile devices probably remains the biggest performance demand for home users. Everything else is either done on the internet (making use of server farms and their incredible resources) or hasn’t really demanded much in the way of performance. Running MS Word has never required a powerful PC. The average home PC has power in abundance, particularly with an SSD, since a HDD is generally your biggest performance constraint today. There’s simply no reason to upgrade. Windows 8 being a confused mess (its Metro store still hasn’t taken off) didn’t help matters. But is that the only possible cause for a PC slowdown? Maybe. But if this holds true, then it also doesn’t suggest that we’ve entered a post-PC era. Rather, we’re just not willing to upgrade right now.

The Rise of Tablets

The other side to the story is that tablet sales are going up. iPads are hot items. If you go back through the history of DisCONNECT, you’ll find I laughed at the idea of an iPad initially. Then I caved and imported one from the US, long before they turned up here in Australia. Yes, yes, more dollars than sense, etc. Since then, I’ve always had a tablet of some description, whether it’s an iPad or something else. Right now I have an iPad Mini as well as the W700 (a Windows 8 tablet), and a powerful desktop gaming PC. My dad has an iPad. My mum has an iPad, and the last time she really knew her way around a computer was back in the mid 90s, working on an ancient hospital system running MS DOS. In 1997 they were discussing whether to move to Windows 3. Fun fact – my mum worked on a team to design a hospital database system, a derivative of which is still in use today in the public healthcare system, yet she can barely work Google. My mum is an excellent example of why tablets are so popular, though – they perform many basic computing tasks in a very easy-to-use, always-on package which you can carry with you. My mum can work an iPad. She understands apps. Functionally, for her, it’s no different to a computer.

My dad is a different case – he’s been using computers since the early 90s and taught me the fundamentals of how to build them all those years ago. Tech moves fast and he got left behind somewhere around Windows XP. He makes extensive use of his iPad for web browsing and some simple apps for getting a few other tasks done. The app system makes sense to him in many ways and simplifies some of the bigger issues he has with navigating a PC (like navigating file systems… no idea why he can’t manage this, it hasn’t changed that much Sarge!). But unlike mum, the old man runs into limitations of the platform from his more advanced use. He gets frustrated with the way that some apps just can’t talk to other apps, or that some file types just aren’t supported. It’s like a desktop until you start trying to multitask or push the system further. Which is where people like myself run into issues – it’s still not as capable as a desktop computer. Yes, I can edit videos on my iPad, but not to the same capacity as my PC. Yes, I can type a document on my iPad… actually no, I can’t, not without an external keyboard, and even then I’d still want to use at least a laptop for that.

But at the same time, even if you don’t like the idea of tablets, it’s absurd to suggest that they’re incapable of being used as an adjunct device to a computer. Some people dismiss them as mere toys, but I think that’s discounting them too much. They’re only going to get faster, and with increased power comes increased capability – this is where the upgrade treadmill is now at its height. It’s honestly remarkable that an iPad can do any sort of video editing at all, considering how small it is. With so much of our lives lived online these days, a device that acts as a gateway to the Internet is becoming more and more viable. I’m not suggesting we’ll see a resurgence of “dumb terminals” like some sort of back-to-the-80s event, but it makes working on less capable machines more and more viable. Truth is, I can do a lot of things on my iPad that I can do on my desktop when it comes to general computing tasks. I can surf the web, watch YouTube videos, read and answer my mail, read up on my study notes or my textbooks, and even type a document if I’m so inclined (with an external keyboard). If I move to a platform-agnostic application, like Google Docs, working across the two devices becomes even easier.

A tablet doesn’t replace a PC. It probably won’t ever manage that. But I don’t see tablets as being a limiting factor or a dangerous threat. For some people, tablets are easier to use. Some people prefer the entire ‘there’s an app for that’ mindset of easy access to software which is built for a specific thing, rather than the utilitarian approach of the PC. The cloud is starting to take hold – even things like Facebook are becoming places for people to store all their family memories and photographs, because share all the events! With more data in the cloud, a device like an iPad or a Nexus tablet becomes much more capable, free from its limited storage (though a 128GB iPad is comparable to some laptops) and free from some processing limitations. Windows 8 isn’t seeing much movement in its Metro space, and people don’t like it as a desktop OS, suggesting that people don’t want the traditional desktop experience on a tablet. It’s incompatible with our expectations set by the iPad and the Android tablets. Does this “dumb down” computing? No, not at all. Making things accessible does not equate to dumbing things down, because clearly tablets can’t do everything a desktop can. At least not yet. But is convergence approaching, and should it be feared?

Don’t Fear Synthesis

In 1995, if you’d told me I could carry Doom around in my pocket, I’d have said “That’s awesome!” because I was eight or something. If you’d told my dad he could carry around all his documents on a 7″ device and even edit them, he’d be awestruck. Nobody would be going “Oh no it’s killing PCs it’s dumbing it down burn the tablets!” because it’s an absurd response to progress. Yes, tablets are progress. The assumption that they’re always going to be as ‘limited’ as they are now is ridiculous. Windows 8 already made steps towards merging the two. They’re not good steps but they’re steps nonetheless. Apple ironically have done a better job of merging the two platforms while also keeping the desktop PC or laptop with its strengths intact… though Apple’s tight control over everything enables them to do that much better than Microsoft. The dream of course is for there to be no true barriers, no real distinction between the tablet and the desktop save for a docking station. Right now, right this very second, you could have a Windows 8 tablet which does an okay job of being a tablet that can convert to a desktop simply by plugging it into a docking station. Your tablet and your desktop are one and the same. This approach with Windows 8 fails because desktop apps do not work well as tablet apps, and Windows 8’s mobile offerings are a joke and work horribly on a desktop. But isn’t that an ideal situation? That the two just work together as one device? Why is that an undesirable state of affairs? I wouldn’t need a tablet and a desktop, I’d just have a tablet that does the lot.

Of course there are barriers – a tablet with the processing power of my desktop would be absurd. Hell, the W700 or the Surface Pro are more like ultralight laptops or ultrabooks without fixed keyboards than tablets, because neither of them is comfortable as a tablet. But there will still be PCs for that. The PC isn’t going anywhere (and I include the Mac in that statement); we will still need boxes of power to undertake the hard tasks that mobile devices can’t manage. Even when the cloud takes off in full, client-side apps won’t die, because they’re just too useful. Trusting everything to a server isn’t possible. But to suggest that tablets are a cancer to be removed, that they’re killing the industry, is just nonsense. This is progress, exciting progress, which we should encourage. That doesn’t mean you have to buy in now by getting 6 iPads and 12 Nexus 7s, but rather that it’s worth watching to see what becomes of it. We weren’t afraid of progress in the 90s or the early 2000s, so the fear of tablets taking over seems a little strange to me. But I hope it happens, just so I can carry it all with me. That day is coming ever closer.

