14 July 2014

4k versus 1440p and gsync: the next big trade off

For the last three or so years there has been a classic trade off in the monitor world. Monitors are built on a variety of different technologies that determine their characteristics, and you have had to choose between them. You either chose an IPS based screen and got high colour accuracy, but at the cost of more ghosting and being stuck at 60hz, or you chose a TN based screen, which had less ghosting and could do 144hz, but at lower resolutions and with worse colour accuracy. Thus the monitor world was split into professional grade screens and gaming screens, which as a programmer and gamer meant I always had to choose one or the other.

CES in January showed us that pretty much everyone is now talking about 4k, gsync and freesync panels. 4k (UHD) is all based on an enhanced version of TN, and so are the gsync panels. I don't quite know what happened to colour accuracy in all this: either the manufacturers managed to make TN produce a good colour image or they just stopped worrying about it at all. Suffice to say a lot of graphics professionals are not going to want any of the 4k panels on offer when they don't produce print quality matching colour. Windows applications still aren't ready for the scaling necessary for 4k panels, even Chrome doesn't yet cope with it, and so I don't think the Windows 8.1 ecosystem will be the one in which 4k succeeds; the software isn't ready. You could buy one and find out for yourself how annoying icons at 1/4 of the size in some applications but not others are, but I think it's best to wait until Microsoft and the software companies have got everything working better.

Gsync is a great advance for gamers and movie watchers because it's going to fix the classic cadence problems both these applications have. It's looking like it's going to release properly in the next two weeks, rather than the module release that happened in January. Most of the gsync monitors are going to be 1080p, carry a big premium, and likely aren't any more advanced than a current TN 144hz monitor with gsync added. Gsync is likely worth it on its own for a gamer, but those monitors really don't fix the classic trade off between professional screens (higher resolutions and better colour) and gaming screens (higher refresh rates and less blur); they are firmly in the gaming camp. Asus however has the ROG Swift PG278Q coming out on the 28th July (if the scan.co.uk preorder page is to be believed). It's a 27", 1440p gsync based monitor with 144hz and 8 bit colour. No TN screens have had 8 bit colour at high refresh rates, it's always been 6 bit + FRC, and this monitor seems to do it all. If it can get 99% sRGB colour coverage (rather than the 80% or less most TN panels get) it's going to look absolutely fantastic for a gaming monitor. The high refresh rate is really nice, and on top of that it's 1440p, which is about the sweet spot for higher density until Windows applications scale better. It could be the perfect trade off monitor.
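To make the cadence problem concrete, here's a small sketch (my own illustration, not from any gsync documentation) of why a fixed-refresh panel judders on film content while a variable-refresh panel doesn't:

```python
import math

def refresh_counts(content_fps: float, refresh_hz: float, frames: int):
    """For each source frame, count how many refresh cycles it stays on screen
    when the display runs at a fixed refresh rate."""
    counts = []
    shown = 0  # refresh cycles consumed so far
    for i in range(1, frames + 1):
        # the frame is replaced at the first refresh at or after its end time
        end = i * refresh_hz / content_fps
        next_shown = math.ceil(end)
        counts.append(next_shown - shown)
        shown = next_shown
    return counts

# 24fps film on a fixed 60hz panel: frames alternate between 3 and 2
# refreshes (the classic 3:2 pulldown), so on-screen frame times alternate
# between 50ms and 33.3ms -- that uneven cadence is the judder you see.
print(refresh_counts(24, 60, 6))   # [3, 2, 3, 2, 3, 2]

# A variable-refresh (gsync) panel instead waits for each frame, so every
# frame is held for exactly 1000/24 ~= 41.7ms and there is no cadence at all.
```

The same logic explains game stutter: any frame rate that doesn't divide the refresh rate evenly gets an uneven cadence on a fixed-refresh screen.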

Later this year there will be a 4k gsync monitor. The problem is that none of the cable standards today actually support 120+ hz at 4k resolutions with 8 bit colour. This is a real problem, because I am sure a manufacturer out there has realised that a lot of gamers would want to play at 4k for the increased fidelity, and do so at higher refresh rates if possible, but alas the standards for that amount of bandwidth just aren't ready yet. So the next generation of trade off is looking a little different from before:
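A quick back-of-the-envelope calculation shows why the cables are the bottleneck. (The link-rate figures below are the commonly quoted effective data rates for DisplayPort 1.2 and HDMI 2.0, so treat them as approximate.)

```python
def raw_gbit_per_s(width: int, height: int, hz: float, bits_per_pixel: int = 24) -> float:
    """Raw pixel payload in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bits_per_pixel / 1e9

uhd_120 = raw_gbit_per_s(3840, 2160, 120)  # ~23.9 Gbit/s of pixel data alone
uhd_60  = raw_gbit_per_s(3840, 2160, 60)   # ~11.9 Gbit/s

# DisplayPort 1.2 carries roughly 17.28 Gbit/s of video data (4 lanes of
# HBR2 after 8b/10b coding overhead); HDMI 2.0 roughly 14.4 Gbit/s. Even
# before adding blanking, 4k at 120hz with 8 bit colour doesn't fit, while
# 4k at 60hz does -- which is why the first 4k panels top out at 60hz.
print(round(uhd_120, 1), round(uhd_60, 1))
```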

1) The old IPS screens with 60hz, 8-10 bit colour and low density (<100 dpi)
2) The new TN 4k monitors with 60hz, 6 bit colour but high density
3) The gsync 1080p monitors with 144hz, 6 bit colour, low density
4) The gsync Asus ROG Swift 1440p with 144hz, 8 bit colour, medium density

That Asus monitor stands in a class all of its own: not only is it gsync based, higher resolution and 144hz, but that 8 bit colour panel just tops it off. I can't wait to see some reviews, and I really hope they got the colour accuracy, anti-ghosting technology and latency right on this one. It might be the first time since LCD displays were released that we have a genuinely low latency, decent gaming monitor that also produces good colour quality. I can dream.

24 April 2014

A bug in Amazon Prime's video player

Update: Amazon acknowledged it's a bug with Silverlight. They didn't say whether they intended to fix it, so I asked directly whether they would, and they have not responded. So I guess that is a no, they are not going to fix it.

Ever since Amazon Prime was released I have been unable to play any video. All I get is grey static on the screen: little pixel dots in all the colours of the rainbow, all over the picture, while the sound works fine. I went through the standard troubleshooting and at the end of it I still had the same problem.

After contacting Amazon a delightful member of their support team called me back and we went through some of the same troubleshooting tips and a few others as well. While doing this it occurred to me that I have a slightly unusual monitor, a Benq XL2411T. It says it supports encryption to the screen so I wasn't too concerned about that, but it happens to run at 144hz. So on a whim I set it to 60hz and the picture appeared. I tried 85, 100 and 120 and they all worked; 144hz however fails with static. So I have a reasonable workaround, even if the pulse width modulated backlight flickers noticeably at anything but 144hz. I definitely had the support person stumped!

Now here is the question: how long does it take Amazon to go from a bug report via support to a genuine bug report and a fix? It should be easy enough to test (they have hundreds of these types of monitors in their warehouses around the world), and presumably it's just a cadence algorithm issue, because they don't seem to have 144hz as one of the frequency options in their software. I am quite intrigued to see how long it takes to go from report (24 April 2014) to fix (not yet). It will be a lovely measure of their agility.

14 March 2014

My Documents is no longer mine

With the release of Windows Vista the user's My Documents directory became standardised by Microsoft, along with My Videos, My Pictures and My Music. Up until that point Windows had little in the way of user level protection of its file system, so programs could write anywhere, and they typically wrote any files they produced into their installation directories. But along came User Account Control to improve security, and then we started to see change in Windows applications.

The change started slowly, because at the time of release a lot of programs required the user to run them in administrator mode for a few reasons. Most wrote preferences to the registry, which still worked, but game saves and other bigger files were being saved in the install directory, which didn't. With UAC they could not write into their installation directories any more, so something had to change. Gradually, as the years rolled on, companies stopped writing files into the installation directory, and the need for admin mode was relegated to installation for most software. Instead they started writing into directories they knew were owned by the user, so that they had permissions. Beyond the user's home directory we had the defined directories of My Documents, My Videos and My Pictures, along with AppData and a few others.

Games developers decided that the best place for their game saves was the My Documents directory. That presumably wasn't a hard choice: after all, saves aren't music or videos, and putting them in AppData would have meant a hidden directory designed mostly for temporary files. So a game would make a directory under My Documents based on either the company name or the game name, and within it save some preferences and the saved game files. The problem is that game saves are not really mine, and they are certainly not something I would class as documents. They are game saves, and while I may have done the action that generated the file (well, it's mostly autosaves) I certainly wouldn't call it one of my documents. My Documents is now anything but mine; it contains a tonne of auto generated files from a variety of companies. Intel's NAS performance testing software puts its performance runs in there, Visual Studio puts a whole directory of config in there, and the list goes on and on. At least Nvidia's Shadowplay puts its videos in the My Videos directory, and more importantly it does so under the name of the game and not under an Nvidia directory, but it's still using one of my directories, which contains my things, and automatically writing content in there that I may or may not want.
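The two conventions in play look roughly like this. (A hypothetical sketch; "ExampleStudio" and "ExampleGame" are made-up names, and real applications would query the known-folder locations through the Windows shell API rather than environment variables.)

```python
import os
from pathlib import Path

def documents_save_dir(company: str, game: str) -> Path:
    # What most games actually do: a company/game folder under My Documents,
    # cluttering a directory that is supposed to hold the user's own files.
    docs = Path(os.environ.get("USERPROFILE", str(Path.home()))) / "Documents"
    return docs / company / game

def appdata_save_dir(company: str, game: str) -> Path:
    # The alternative: AppData is also user-writable under UAC, and hidden,
    # so saves there wouldn't pollute the user's visible documents.
    appdata = Path(os.environ.get("APPDATA", str(Path.home() / "AppData" / "Roaming")))
    return appdata / company / game

print(documents_save_dir("ExampleStudio", "ExampleGame"))
print(appdata_save_dir("ExampleStudio", "ExampleGame"))
```

Both locations are writable without admin rights, which is exactly why My Documents became the dumping ground.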

Not surprisingly, My Documents no longer contains any of my documents at all. I store those in a whole other directory I made called checkouts, which includes all the git repositories I work on, one of which genuinely does contain documents I have created. It's a pain to access, however, because despite the fact that I can produce links and a library for it, it's not got the same status as the illustrious My Documents directory. I would like to make checkouts the default directory that comes up when I am uploading a file or loading a document in an application, but I can't; they all seem to fall back to the same default, a default that made sense before all this software started co-opting it for their own purposes. I can't be the only person a little bit incensed that such a core directory contains nothing but produced files that have nowhere else to go because of UAC. It's no longer mine, it's owned by all the software I use, and it should probably be renamed "Generic stuff". It's become a dumping ground for everything and anything, but definitely not my documents.