Apple's new A7 processor in the iPhone 5S is apparently twice as fast as the outgoing A6, which is now reserved for the down-market likes of the iPhone 5C.
It's got a gazillion times more gaming power and it's 64-bit. But while that last part makes a good marketing soundbite, it may not be quite as important as you think.
But surely 64 bits are better than 32 bits, right?
There are huge advantages to using a 64-bit processor rather than a 32-bit one. Just about every PC and laptop in the world today has a 64-bit processor, and as a result they can handle larger programs with more complex instructions and run games with ridiculously detailed imagery. The PC business is a cut-throat one with tiny profit margins – manufacturers wouldn't use more expensive processors unless they had to.
Woohoo, so that means my 64-bit phone is faster than your crummy old 32-bit one, right?
It is, but not because of the number of bits it has.
I don't understand. You just said...
If the A7 processor is as fast as Apple is claiming, it's because it runs at a higher clock speed and has a new architecture based on the ARMv8 design, which is more efficient at crunching numbers. Moving from 32-bit computing to 64-bit computing means nothing in terms of an inherent performance boost.
Going back to first principles, a bit is the smallest unit of computing, out of which all others are made. And when we say small, we mean simple: a bit is either 'on' or 'off', '1' or '0'. You can use bits to count in binary, and so long as you have enough of them you can represent any number. Having more bits at your disposal means you can build longer, more complex instructions.
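A rough sketch of that idea in Python (the helper name is our own, purely for illustration): every extra bit doubles the number of distinct values you can represent, and counting in binary is just strings of those on/off bits.

```python
def distinct_values(bits):
    """Number of distinct values an unsigned quantity of this width can hold."""
    return 2 ** bits

# Counting in binary: each bit is just 'on' or 'off'.
print(bin(5))               # 0b101  (1*4 + 0*2 + 1*1)
print(distinct_values(8))   # 256
print(distinct_values(64))  # 18446744073709551616
```

So a 64-bit quantity can take over 18 quintillion values – which is why the jump matters for some jobs and not at all for others.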
OK, I am not following this at all...
Here's an example you might understand: the move from 8-bit to 16-bit computing revolutionised videogames. 8-bit colour gives you only as many colours as there are combinations of eight 1s and 0s: 256. 16-bit colour gives you 65,536 different shades, which is why the SNES knocked the Spectrum for six when it came to creating worlds. When the PlayStation came along with 32-bit processing, game designers were able to create semi-realistic shading that could almost fool the human eye.
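Those colour counts all come from the same doubling rule; a quick sketch (again with an illustrative helper name of our own):

```python
def colours_at_depth(bit_depth):
    """Distinct colours addressable at a given colour bit depth."""
    return 2 ** bit_depth

print(colours_at_depth(8))   # 256 - Spectrum-era palettes
print(colours_at_depth(16))  # 65536 - SNES-era shades
print(colours_at_depth(64))  # vastly more than the eye can tell apart
```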
So 64 bits means better graphics, better games. Woohoo!
Steady on there – that was just an example. Colour processing at 64 bits is utterly unnecessary and way too computationally intensive for gaming anyway – that's far more colours than the eye can discern. But we're getting away from the point now.
The point is that increasing the bit depth of the processor allows you to do new stuff, but it has no effect at all on the basic speed of a CPU or its ability to run existing code. And turning all of your old, short instructions into long, complex ones can actually make them harder to run.
Why go 64-bit at all?
For desktops and servers, the key benefit of moving to 64-bit processing is the ability to use more memory. The maths is straightforward: a 32-bit processor can only address 2^32 bytes of memory, which works out at 4GB. Web servers, big data crunching machines and top-end gaming rigs need to hold more than 4GB of data in system RAM to run smoothly, so 64-bit processing is essential, even if a lot of the code is still 32-bit.
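That 4GB ceiling falls straight out of the address width; a minimal sketch:

```python
# Each byte of RAM needs its own address; a 32-bit address has 2**32
# possible values, so a 32-bit processor can see at most 2**32 bytes.
addressable_bytes = 2 ** 32
print(addressable_bytes)                 # 4294967296
print(addressable_bytes // (1024 ** 3))  # 4 -> the familiar 4GB limit
```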
Hang on – 32-bit code on a 64-bit chip?

Yes. Despite more than a decade of 64-bit processing on the desktop, you'd be surprised how many games and other programs are still written using 32-bit libraries. Fortunately, just like desktop processors, Apple's A7 can still run older code without a performance hit.
Then all this fuss about 64-bit is just hype. Nuts.
Don't get too upset. There are some good reasons for the move – we're led to believe the fingerprint scanner in the iPhone 5S requires 64-bit processing, for example. More to the point, what Apple is really saying is that it's adopted ARM's latest v8 chip architecture for its iPhone 5S processor, and that has a whole range of extra features that are directly related to performance. More memory registers, dedicated encryption engines and the like.
But no. Developers aren't suddenly going to start developing 64-bit games that exclude the vast majority of iOS users out there, nor is Apple going to start selling a phone with 5GB of RAM. The key to mobile everything is power efficiency, and that means more elegant code, not longer routines and cruft.
Having said that, all manufacturers who license ARM's chip designs – Samsung, Nvidia, Qualcomm, et al – are able to use the 64-bit architecture if they wish. True to form, Samsung has already said that its next Exynos is going to be a 64-bit one. To be fair, though, the chances are that by next March all of the new smartphone chips will be 64-bit anyway.
So... I'm sticking with my 4S then.
OK – here's something to get excited about. 64-bit computing on a phone isn't about what it can do today, but the tomorrow for which it prepares us.
That's not even grammatical, Yoda.
Think back to the recent Ubuntu Edge phone, the handset that – had it been successfully crowdfunded – would have switched to a desktop operating system when plugged into a monitor. See the magic 'D' word there? When it comes to desktop computing a 64-bit processor is very useful, and Ubuntu's vision is the way the whole world will eventually go: one device, one operating system that can do anything. Or, perhaps more likely, the ability to do any of your normal computing on any device you choose. Which leads us to think that the correct interpretation of Apple's move is that it's setting the stage for a time in the not-too-distant future when the next MacBook Air will be the iPhone 10 – a single, unified operating system on a tiny, tiny machine. Taking bets now.