Thursday, 7 February 2013

Sneaky SD cards!

Does anyone know if you can get brightly coloured SD cards anywhere?

Have you ever tried to find a misplaced microSD card on a dark surface? It's not much fun. I find it hard enough finding standard-sized SD cards, and I've still got pretty good eyesight for a guy who sits in front of monitors an awful lot.

I'm thinking some uber-bright green/pink/yellow cards would do wonders.

As of yet my searches have turned up nothing. If you come across such a thing in your journeys, I'd love to hear about it.

Wait! Skip Windows 8.


Windows 8.. Seriously.. Ugh. Why Microsoft? Why???

I recently had the pleasure of assisting someone with a messed up 4G install on their new W8 laptop... I nearly died from the traumatic experience.

This wasn't the first time I'd looked at or used a W8 machine, just the first time I've really looked at it objectively.

Seriously people, I don't know how to stress this without sounding like a complete snob/idiot/whatever/know-all. This isn't just my silly Microsoft prejudice speaking here.

Windows 8 is BAD. BAD BAD BAD!
Even its name suggests giving it a miss.. W8.. W-eight. WAIT? What?

I'm not talking about it just being bad for geeks like me, I'm talking about workflow and getting things done.

It's not just user interface aesthetics. It's a major UI meltdown. This kind of thing works (barely) on an Xbox console, but this is not ground-changing design, it's ground-burying design. Even as a touchscreen interface, this is crap.

The user should not have to know which mouse button to press, where to click or what key to press to get the UI to respond in a way that lets them get what they want done. Microsoft's former "Start" button concept at least provided a practical way to interface with the computer.

Microsoft has made the HUGEST mistake of all the mistakes it has made as a company. So bad that, for the sake of Microsoft stockholders, I believe Steve Ballmer should be sacked for being thoroughly incompetent.

Don't ruin your computer by "upgrading" to W8. Stick with 7; even a Microsoft unappreciator like me can admit that one's comparatively good.

Please don't buy a new notebook with W8 on it, unless it comes with a free "downgrade" option, or you're considering putting a decent free OS like Linux Mint on it.

Macs are also a good choice. As someone who's spent a lifetime with computers, only avoiding Macs because of the price and excessive shiny.. Let me say this: I wish I'd discovered how good Macs were sooner. I've experienced a massive improvement in my workflow since switching over.

Help send a message so Microsoft doesn't make this mistake even worse by thinking it's OK to take a huge dump on our computers. Maybe they'll get it right by Windows 10.. who knows. That's only if they don't go bankrupt.

Tuesday, 29 January 2013

It's a little wet in Bundaberg

Haven't been getting much done the last week..
Not out of lack of motivation; it's just been too wet and depressing with the floods and stuff in Bundy.

My house is a good 5m above the peak water line, so we were pretty much safe.
And we were lucky enough to keep power and ADSL.

A large part of town was cut off, and a lot of people are dealing with crap right now.

Hopefully they managed to save some of the rum..

Here's some pics I took; it wouldn't let me upload any more than these.


The bridge around the corner from my place



Floodwater can be rather grotty

Roads everywhere were cut off

Thursday, 24 January 2013

It's Backup Time!


This is a friendly reminder to Back. Your. Freaking. Data. Up.

Have you backed your files up today? 
Do you do it every day? 
Does your computer automatically do it for you? 
Are you _SURE_ your backups are working properly?

Because if you don't keep regular backups, your puny backup-less computers are all going to die horribly and all your precious unsaved data is going to hell. Well, computer hell anyway. 

They will die, not due to random chance, manufacturing faults, power surges, lightning strikes, Mean Time Between Failures, solar flares or from being physically dropped, but because I said so! Yes, I'm an evil b'stard willing hard drive crashes on all you backup-less philistines! :-P

(Actually, your hard drives quite possibly might be dying from those other things too; in fact I'm pretty sure my ill will towards your un-backed-up data doesn't have that much sway in their demise. But I still find it funny when it happens, because I warn people all the time and they don't listen.)

If you're currently feeling all smug because you're already prepared, go check your backups are still working, and then go back to feeling smug.. Then you can snigger along with me over the fates of all those silly sausages who don't listen ;) 

The Consequences

Save yourself from shame, embarrassment and despair in front of the IT geek at the repair store (or your ever-so-patient nerd friend) when he asks where your backups are. And stop getting mad at them for YOUR failure to be prepared, or for them not being able to resurrect your data from the unpleasant depths of Satan's colon! 

How sad would you be if you lost all your precious photos of Junior and the antics of your cat? The home movies of Sally taking her first steps, or of her barfing up on Granddad. 

What if you lost all your MP3s along with other piles of preciously hoarded crud scattered over your hard drive?  

Need I even mention how much it would suck to lose that assignment that's due in 3 days?

I'm sure you are all familiar with Murphy's Law (<--a good read by the way) and its variations. With computer hard drives, you can pretty much absolutely freaking guarantee that all hell will break loose at the exact time that is absolutely worst for you.

You'd be amazed at how many extremely computer-savvy people I've talked to who still forgot to ensure they had adequate backups, and were forced to do the walk of shame. And yes, that includes quite a few programmers and techies. (Admittedly myself a couple of times; I was too dumb to learn the first time.) 

It's one thing to know how to reinstall your system and even recover files from a dying drive, but another to get back the hours wasted trying to restore that one damn super-important folder you forgot to keep a copy of. 

Time is precious. Disk space is cheap.

Backing up your data is not hard

Now please note that I'm not talking about business solutions, I'm talking home-use stuff. But if your computers at work haven't even got the basics, you'd better take notes here and go have a serious talk to your IT support guys about it.

I'm not going to explicitly show you how to set anything up.. I'm just going to give my best recommendations here, and hopefully encourage some people to take action.

If you're still unsure of what to do after this, have a talk to your techie friend/neighbourhood computer whiz/repair store guy for the full story. 

And also note I'm not talking about solutions for massive (but of course legally obtained) "media" archives. It's best to keep these larger files on separate hard drives dedicated to media so they don't clog up your normal backups. I have observed that the most common and cost-effective practice is to keep a spare copy of these media archives at a mate's place... if you know what I mean. ;)

How much space for backups?

A rough rule of thumb is that you need enough storage for 2-3 times the internal disk space of each computer you need to back up. So for example, for a single computer with a 500GB drive, a backup disk of at least 1TB in capacity would probably be enough for average users. Having more disk space available for backups allows for a longer backup history.

With bigger drives, you can use some of the space on the backup drive for other things, but it's a good idea to partition the backup disk, so you don't go wondering where all the space went when the backup software fills it without telling you.

Backup Hardware

There are quite a few good solutions for backup storage; some even come with backup software or other nifty tools when you buy them, not that I've found any of the included junk to be much good.

External USB Storage
If you just have one computer, go get yourself a nice big external USB backup hard drive. (They are damn cheap.) At the start of 2013 they came in sizes up to 4TB.

Brands? My personal preference is Western Digital, but I know many geeks will spit vile hatred at me for suggesting they are acceptable. I've had my own plethora of bad experiences with other brands.

In reality, it's much of a muchness, buy whatever the heck you like and can afford, just don't expect it to last forever.

Windows XP may choke on drives bigger than 2TB. But that serves you right for running a 10-year-old operating system that's nearly out of support. :-p Upgrading that old computer to Ubuntu or something costs nothing and allows you to run bigger drives.

I use a 3TB WD My Book Studio drive on my Mac, partitioned into a 1TB chunk for backups, and the rest as a data store for crap I don't really care if I lose. I'd have used a cheaper USB 3.0 drive if Apple bothered to take standards seriously, but it definitely looks pretty next to my iMac.

Network Attached Storage
If you have multiple computers or just need something more clever, get a NAS (network attached storage) server and plug it into your router. These are basically small computers designed to share data storage space on your home network.

They even have off-the-shelf external drives with Ethernet ports or wireless now. Handy, and surprisingly cheap too. 

There are many brands to choose from, and none I can confidently say is better than another.. I suggest a quick google for "NAS" to see which ones get the best reviews.

I've noticed some of the cheap single-drive network drives don't let you repartition the disk space. This may cause some problems if you want to use them with multiple computers, or want some extra space for other things too.

For the rest of my computers and devices at home I have a D-Link DNS-320, equipped with 2x 2TB 3.5" SATA hard drives. It's a bit slow transfer-rate wise, and can't do as much with a decent-sized external drive as it should be able to, but it works well enough for what we need here. 

Cloud Storage
You could always spend loads of dollars for extra space on a cloud storage account like DropBox/Google Drive/Amazon/etc. These services are like having a NAS box on the other side of the world, except someone else does the yucky maintenance for you.

Most encrypt your data and provide a secure method of accessing it. You may need to take a manual approach to backups using cloud services, but I'm sure there is software that automates it. (Ubuntu's native backup solution, Deja-Dup, integrates with the Ubuntu One cloud service nicely.)

I personally find cloud solutions better for "in case of fire" backups, like family photos and other precious documents and stuff. There are plenty of free ones that give enough space for most people for those kinds of purposes. (By all means, keep the important stuff backed up at home too!)

Be aware that if you keep files on a paid subscription cloud service and you let your subscription run out, those files are gonna go bye-byes.

Drive lifetime?

Backup drives up and die just like regular hard drives, so it's good practice to replace them every couple of years.. Your mileage may vary; some have to replace them yearly, others get lucky and don't. 

Old backup drives that are still in working order make good media backup drives.

This is probably the one advantage of Cloud Storage, because if the provider is doing things right, they factor all that into the cost and do the maintenance for you. I guess it comes down to whether you can trust them to take care of your precious backups.

Backup Software

A shiny new backup drive is not as cool as it could be if you don't have some decent backup software to go with it. It's best to have something that automatically deals with this for you.

You may find you only need your home directory backed up, but if you aren't the kind of person to keep all your goodies under one folder, or haven't the slightest clue where all your stuff is, it's best to look for whole-system backup options.

Apple
Apple's Time Machine is so gooood, and it's built in on all recent Macs. 

Make sure to give it a dedicated partition on your new backup drive of at least 2 or 3 times your internal hard drive's total space. More if you wanna keep a longer history. 

You don't have to partition; you can give it the whole backup drive if you want to. Just don't go wondering why it fills the whole 4TB or whatever up in a month or two, leaving no room left over for junk you don't want to store on your hard drive. 

(Partitioning is done in Finder->Applications->Utilities->Disk Utility)

Microsoft
Acronis True Image is good for all you (sad, lonely, pretty-UI-deprived) Windows users. From my days as a computer techie, this is one program that just WORKED as intended.

The latest two versions of Windows may have enough backup capabilities built in, provided you take the time to learn to use them.

Linux
The majority of modern decent Linux distributions also have backup capabilities pretty much as standard. If you don't find one already installed, there is a plethora of backup software in the package management tools. Search your distro's software centre for "Backup" and pick the one that works best for you.

Ubuntu's Deja-Dup looks pretty cool. (Pity the rest of the Ubuntu doesn't look so cool anymore. Unity sucks! :-p)

If you're a geek, rsync your critical folders to your backup drive, but then again, I shouldn't even have to tell you this. In fact, turn in your geek card if you haven't got decent backups set up already.

And if you're a software developer and not using a source revision control tool with an offsite server, I laugh at your hideously depressing loss of code. Muahahahahaha. 

In Summary

A properly configured backup solution requires little maintenance or thought. Just check it's actually backing up regularly, and don't forget to monitor your backup drives for errors along with your system drives.  

Do it right this year, and you too can snigger along with us other prepared people next year, at all those poor dumb schmucks who still don't get why backups are so important..

Monday, 21 January 2013

Texture caching OpenGL fonts

As I mentioned in an earlier post, reliable GL font rendering has been a major source of cross-platform headaches for me in SleepyHead.

What works perfectly on Linux will look like crud on a Mac, and will break on Windows depending on the off-screen rendering method used.

Big mess, until I got unlazy and implemented a pixmap caching system to try and work around this, and the results were rather surprising.

The boring technical stuff
Previously, I was either using the QGLWidget::renderText() method or using QPainter to draw text directly to the screen device. As I said, neither method looked good on Macs, and neither allowed for glitch-free offscreen text rendering, no matter what frame/pixel buffering method I was using.

As part of an earlier attempt to improve performance, all graph-area text in SleepyHead is queued up for drawing until all the other graphics primitives have finished drawing; then the DrawTextQue function in my graph area class gGraphView (QGLWidget-derived) dumps this queue to the paint engine.

I'm still using this text queue, however now, instead of rendering to the GL context/screen device, I create a QPixmap/QImage of the dimensions the text would occupy (using QFontMetrics to get the sizes). I then render directly to the pixmap using QPainter; that image is then bound to a 2D texture and drawn to the screen.
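
In rough sketch form, that render-to-image step looks something like this. The names font, text, color, antialias, x and y are just stand-ins for whatever the queued text entry carries, not the exact code, and it assumes we're inside the QGLWidget-derived class so bindTexture()/drawTexture() are member calls:

// Sketch only: render a text string into a QImage sized via QFontMetrics,
// then hand it to the QGLWidget as a 2D texture.
QFontMetrics fm(font);
QRect rect = fm.boundingRect(text);

QImage image(rect.width(), rect.height(), QImage::Format_ARGB32);
image.fill(0);                              // transparent background

QPainter painter(&image);
painter.setRenderHint(QPainter::TextAntialiasing, antialias);
painter.setFont(font);
painter.setPen(color);
painter.drawText(0, fm.ascent(), text);     // baseline sits at the font's ascent
painter.end();

GLuint textureID = bindTexture(image);      // QGLWidget::bindTexture
drawTexture(QPointF(x, y), textureID);      // QGLWidget::drawTexture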

Now if I just left it there, all I'd have achieved is slowing things down dramatically, as it creates an extra step for every element of text drawn. (Which does alleviate the Mac font problems, but is still not acceptable.)

Enter pixmap caching.

Qt provides its own image caching solution called QPixmapCache, which would work fine if I wasn't using so much native GL code. It currently lacks the ability to unbind OpenGL textures when purging the cache.

So I wrote my own, which basically does the same thing, but unbinds textures on cleanup.

For each text element that is drawn, I construct a hash key containing the text itself, a text key describing the QFont, a string describing the color, and the antialias boolean setting.  Size and rotation don't matter here.
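
Something along these lines (just a sketch; the exact key format doesn't matter so long as it's unique per text/font/colour/antialias combination):

// Sketch: build the cache key from everything that changes the rendered pixels.
// QFont::key() gives a textual description of the font, QColor::name() the
// "#rrggbb" string. Size and rotation are handled at draw time, so they stay
// out of the key.
QString hashkey = text + "_" + font.key() + "_" + color.name()
                  + (antialias ? "_aa" : "");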

I search a QHash using this key to see if there is a prior record and image available for this text item.
The relevant structure looks like this:


struct myPixmapCache
{
    quint64 last_used;
    QImage image; // was originally a pixmap.. i'm too lazy to rename the struct
    GLuint textureID;
};

The cache itself is defined as the following hash:
QHash<QString,myPixmapCache *> pixmap_cache;
If the hash key is not found, the text image is created, bound to the record's textureID, and a new item is saved in the QHash.

Now that the text element is definitely in the cache, the 2D texture can be blitted to the screen device. I use QGLWidget::drawTexture() to do this.

Then the last_used field is updated with the current timestamp. This is for the benefit of the cleanup algorithm, which runs at the beginning of the DrawTextQue function whenever the pixmap cache's memory requirements go beyond a set threshold.
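
Put together, the hit/miss path inside DrawTextQue looks roughly like this. It's a sketch with made-up names; renderTextToImage() is a hypothetical helper standing in for the QPainter step shown earlier:

// Sketch of the cache lookup inside DrawTextQue.
myPixmapCache *pc = pixmap_cache.value(hashkey, NULL);

if (!pc) {
    // Cache miss: render the text to a QImage, bind it to a GL texture
    // and remember it for next time.
    pc = new myPixmapCache;
    pc->image = renderTextToImage(text, font, color, antialias); // hypothetical helper
    pc->textureID = bindTexture(pc->image);                      // QGLWidget::bindTexture
    pixmap_cache.insert(hashkey, pc);
}

// Hit or freshly created: blit the cached texture and touch the timestamp
// so the cleanup pass knows this entry is still in use.
drawTexture(QPointF(x, y), pc->textureID);                       // QGLWidget::drawTexture
pc->last_used = QDateTime::currentMSecsSinceEpoch();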

I track each image's memory usage roughly as width()*height()*(depth()/8).

To clean up, I scan through pixmap_cache looking for outdated last_used timestamps (about 4 or 5 seconds old) and drop those keys into a QList, then afterwards go through the QList, unbinding and deleting the pixmap cache entries and removing them from the hash.
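
In sketch form the cleanup pass looks something like this (again, illustrative rather than the exact code; QGLWidget::deleteTexture() is what takes care of unbinding the GL texture):

// Sketch of the cleanup pass, run at the start of DrawTextQue once the
// cache's rough byte count (width()*height()*(depth()/8) per image)
// passes the threshold.
quint64 now = QDateTime::currentMSecsSinceEpoch();
QList<QString> expired;

QHash<QString, myPixmapCache *>::iterator it;
for (it = pixmap_cache.begin(); it != pixmap_cache.end(); ++it) {
    if (now - it.value()->last_used > 4000) {   // hasn't been drawn for ~4-5 seconds
        expired.append(it.key());
    }
}

for (int i = 0; i < expired.size(); ++i) {
    myPixmapCache *pc = pixmap_cache[expired[i]];
    deleteTexture(pc->textureID);               // unbind/free the GL texture
    delete pc;
    pixmap_cache.remove(expired[i]);
}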

All this can be seen in qGraphView.cpp/.h

Performance Results
I implemented a switch so this new method can be turned on or off in the preferences, and used the "fps" counter that displays down the right side of the graph view, enabled via Help->Debug Mode.
(It's not a super accurate counter, but it will do. It's a guesstimate of the FPS, not the real thing, as SleepyHead doesn't have to update every frame unless it's got something to do.)

The best improvement came in the Overview tab.. Scrolling over the AHI/usage/etc bar graphs, the tooltip pops up and cruises smoothly at around 200-300fps. Previously I was only getting 21fps and the tooltip lagged behind miserably. That's at least a 10-times improvement.

Vertical scrolling took a huge boost too, as all the horizontal and vertical graph text gets redrawn from the cache each update. Much smoother. It went from 60fps to 300-400fps on my Mac.

Vertical graph headers only ever change when graphs or the screen are resized, so it helped here too.

All in all, a big performance gain, for not a huge amount of effort.  

Pixmap caching can be used for anything that contains repeated graphics primitives such as lines, points and quads; however, text was the biggest performance bottleneck I found in my application.

Saturday, 19 January 2013

Olive branch to the ASAA

The following is a copy of a letter I sent to the administrator of the ASAA forum apneasupport.org. In the spirit of openness I'm posting it here. It has seemed in the past that things I write to them get misinterpreted easily.

Thursday, 17 January 2013

Coding again.. SleepyHead needs an update.

Feels good to be functional again; I'm writing actual code for the first time in a year.
Well, I should say it feels like I'm hitting my head against the wall, but from what I remember, that's fairly normal. :)

Struggling with a couple of annoying Qt bugs on the Mac platform at the moment.

There is a memory leak in QPixmap/QImage which causes crashes in rather silly places due to the way garbage collection is done. Of all the stupid places to crash, it's in the printer code. Consistently inside the QPrinter::newPage() function, while cleaning up graphics resources. Ugh.

To work around this one I think I need to order how pixmaps are dealt with in the printing pipeline so nothing gets garbage collected during printing.

Printing works fine on all other platforms of course.

Then there's the text rendering on Mac.. It's been wonky for quite some time.
Have a look at this graph section below..

D day. I didn't sleep very well back then.

It might be considered a cool graphics effect in minor doses, but all text produced by QGLWidget::renderText or QPainter::drawText on a GL device looks like that on Mac.

When I render to a pixmap, of course it looks beautiful.. Just cropped off to the left is some nice straight text, because the y-axis tickers are rendered to a QPixmap when they need updating, and drawn from an image cached as a 2D texture.

I think I might need to try a similar approach with all GL text. Maybe a Pixmap caching system or something. Might even get a performance boost across the board if I do it right.

Also greatly annoying me at the moment: when I try to render text to an offscreen image, it simply doesn't show up no matter what method I use. How rude. This one only popped up on Mountain Lion, because QGLWidget::renderPixmap no longer works. I implemented FrameBufferObject & PixelBuffer snapshot methods in an attempt to alleviate this, but still no good. At least I've got the pie chart back again.

And once again, it works fine on Linux & Windows. In fact it runs fine in a VirtualBox VM running Linux Mint on this very iMac. Faster, prettier and crash-free too, dammit.


Well, that's what's driving me insane codewise at the moment.


These problems are much easier to deal with now that I have a Mac. Previously, I nearly worked poor Jimbo (James Marshall) to death testing all my Mac code.. I don't know how he didn't go insane. I'm very grateful for his help. Somehow we cobbled together some working SleepyHead goodness on Mac.

Hopefully I can get enough of this sorted to get a new SleepyHead release out soon. A year overdue, but better late than never right?

Rich Freeman has sorted out all the PRS1 60 series patches and committed the code, along with some other bug fixes.. Man, am I an idiot for not handing him the commit keys sooner. His work is very much appreciated.

Anyway. Back to work for me..