As a geek I firmly believe that there is no such thing as too much CPU power, main memory, disk space, or network bandwidth. Recently, however, my beliefs were put to the test.
It started with a new 4GB CompactFlash card for our digital camera, an Olympus C5050 Zoom. The CF card was supposed to replace an old 1GB Microdrive and increase both the capacity and the battery life of the camera. What should have been a simple task (open camera, remove card, insert new card) took me hours. After a firmware update of the camera and various other attempts, I finally found out that the camera only supports the FAT16 file system, which is limited to 2GB. I managed to work around that restriction by manually creating a 2GB FAT16 file system on the CF card. This is not an ideal solution, but it still gives us twice as much disk space and better battery life.
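For the curious: the 2GB ceiling comes from FAT16 addressing clusters with 16-bit numbers at (at most) 32 KiB per cluster. The little Python sketch below works out that arithmetic and shows one way the repartitioning could be done on Linux with parted and mkfs.vfat; the device path /dev/sdX is just a placeholder, so double-check where your card reader actually shows up before running anything like this.

```python
import subprocess

# Why the 2GB ceiling: FAT16 uses 16-bit cluster numbers, and the largest
# widely supported cluster size is 32 KiB.
max_clusters = 2 ** 16            # 65,536 addressable clusters (a few are reserved)
cluster_size = 32 * 1024          # 32 KiB per cluster
print(f"FAT16 ceiling: {max_clusters * cluster_size / 2**30:.1f} GiB")  # -> 2.0 GiB

DEVICE = "/dev/sdX"  # placeholder -- check with lsblk which device the CF card is!

def make_2gb_fat16(device: str) -> None:
    """Create a single 2GiB partition on the card and format it as FAT16."""
    # Write a fresh MS-DOS partition table with one 2GiB primary partition.
    subprocess.run(
        ["parted", "--script", device,
         "mklabel", "msdos",
         "mkpart", "primary", "fat16", "1MiB", "2GiB"],
        check=True,
    )
    # Format the first partition as FAT16 (mkfs.vfat chooses the cluster size).
    subprocess.run(["mkfs.vfat", "-F", "16", device + "1"], check=True)

# make_2gb_fat16(DEVICE)  # uncomment only once DEVICE really points at the CF card
```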
A few days later I wanted to install a firmware update onto the LO100c remote management card in my server. Unfortunately, HP only supports two ways to do that:
- Update the firmware by running a program under Windows on the machine itself.
- Create a bootable USB stick with the update program on it under Windows.
As I don’t have Windows installed on my server (or anywhere else, for that matter), I tried the second option with my brand-new 8GB USB stick. But the *rude language censored* update program complained that the USB stick was larger than 2GB and refused to write the software to it. I had to borrow the old 256MB stick from Silke to complete the firmware update.
So does all this mean that you can have too much disk space? By no means! There is just a lot of outdated software out there. 🙁