4K and the spec arms race
Sticking with the Galaxy Note 3, there’s another area where Samsung has missed a trick by attempting to impress on paper: the camera. It’s a very capable camera, but Samsung, perhaps sensing that its merits over, say, the LG G2’s camera wouldn’t be clear enough to prospective buyers, decided to add 4K video recording.
Now I love “the latest thing” as much as the next Stuff writer, but this is pure, pointless willy-waving. Firstly, hardly anybody owns the 4K TV or projector required to watch this footage, and secondly, who wants to watch 4K footage captured by a (relatively) tiny smartphone camera sensor? It’s going to look dreadful. 4K is something that should be brought to smartphones down the line, once the technology is more widespread.
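Some back-of-envelope arithmetic makes the practical cost of 4K capture concrete. The ~48Mbit/s bitrate below is my own assumption, typical of early 4K phone recording modes, not a figure from any manufacturer’s spec sheet:

```python
# Rough arithmetic on what 4K video capture demands of a phone.
# The bitrate is an illustrative assumption, not a quoted spec.
WIDTH, HEIGHT = 3840, 2160              # 4K UHD frame
pixels_per_frame = WIDTH * HEIGHT       # pixels the sensor pipeline must deliver, 30x per second
bitrate_mbps = 48                       # assumed encoder bitrate (megabits per second)
mb_per_minute = bitrate_mbps / 8 * 60   # megabytes written to storage per minute

print(f"{pixels_per_frame / 1e6:.1f} million pixels per frame")
print(f"~{mb_per_minute:.0f} MB of storage per minute of footage")
```

At those assumed figures, a few minutes of holiday footage eats over a gigabyte of a phone’s storage, with nothing to play it back on.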
Again, Samsung’s resources are being funnelled away from areas where real improvement can be made in order to make a product look better than its rivals (the LG G2's camera is in fact better than the Note 3's, despite its lack of 4K recording). And this in turn pushes rivals into a spec arms race in which they feel compelled to waste time and money implementing things like half-baked 4K video capture in order to “keep up”. At a recent smartphone launch’s Q&A session, a fellow journalist bemoaned the newly-unveiled device’s lack of 4K recording to company reps. Does he really believe that any phone in 2013 “needs” 4K recording?
Do we really need sharper screens?
The smartphone and tablet world seems particularly beset by this attitude at present. Phone screens reached the point where the human eye can’t make out individual pixels at normal viewing distances three years ago, with the iPhone 4’s 326ppi Retina display, yet manufacturers have fallen over themselves to push pixel densities ever higher. The HTC One’s screen boasts a staggering 468ppi, but how much benefit does the user actually get, aside from the warm feeling that he or she owns “the phone with the sharpest screen”?
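A rough sketch of why densities much beyond 300ppi add little at arm’s length. The one-arcminute figure is the standard textbook approximation of human visual acuity, and the formulas below are my own illustration rather than anything from the article:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density from resolution and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_distance_in(ppi_value, acuity_arcmin=1.0):
    """Viewing distance (inches) beyond which adjacent pixels subtend
    less than the assumed visual acuity and blur together."""
    pixel_pitch_in = 1.0 / ppi_value
    return pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60.0))

iphone4 = ppi(960, 640, 3.5)  # roughly 330ppi (Apple quotes 326)
print(f"iPhone 4: {iphone4:.0f} ppi")
print(f"Pixels indistinguishable beyond ~{retina_distance_in(iphone4):.1f} inches")
```

By this estimate a ~330ppi screen is already “pixel-free” at a typical 10–12 inch phone viewing distance, so pushing well past 400ppi mainly improves the spec sheet.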