OK, so today I set out to create a very simple backup script (bash) that could be run periodically from the crontab to back up the contents of an arbitrary directory in the filesystem (in this case, my user’s home directory). I was originally planning on writing a simple tar-based incremental or differential backup script when I stumbled upon a neat little utility called duplicity. It’s apparently been around for quite some time and is in fairly common use, but… it’s new to me!
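As a rough sketch of the kind of cron-driven duplicity script I have in mind (the paths, destination URL, and schedule below are all placeholders, not my actual setup), it could look something like this:

```shell
#!/bin/sh
# Hypothetical backup sketch: incremental backups of the home directory
# with duplicity, starting a fresh full backup once a month.
SRC="$HOME"
DEST="file:///mnt/backup/home"   # duplicity also speaks scp://, ftp://, etc.

# Incremental backup, or a full one if the last full is older than a month.
# Drop --no-encryption if you want GPG-encrypted archives instead.
duplicity --full-if-older-than 1M --no-encryption "$SRC" "$DEST"

# Keep only the two most recent full backup chains.
duplicity remove-all-but-n-full 2 --force "$DEST"
```

A crontab entry like `0 3 * * * /usr/local/bin/backup.sh` would then run it nightly at 3 a.m.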
They say there are two types of people: those who back up and those who don’t. Today I finally got to test my limits and see how far I could get. I had my bag of goodies with me in case something like this ever happened; I just didn’t know it would happen to me!
I learned about Clonezilla when I wanted a cloning tool for Linux that could also copy NTFS, so that if anything happened I could image someone’s drive and restore it in minimal time. As it happened, I had already used Clonezilla to image a Windows XP machine for someone, and it worked out very nicely.
Head over to Clonezilla’s website, download the ISO or ZIP, and boot it by whatever method you like. I tried my hand at the USB installer first: I downloaded the current live CD, formatted the USB stick, and everything looked good until I booted from it, when I got a simple boot error message. I didn’t have time to break it down, but I’ll get into it later.
After making a bootable CD I fired it up, accepted the defaults, and got to the point where it asked for my source and destination hard drives. After selecting the correct drives it started to do its thing, and needless to say I was pretty impressed. Well, after about 6 minutes I started seeing logical block read failures from Clonezilla: it looked like my drive could not read from a certain block, and after 5 minutes of it repeating itself I knew it wasn’t going anywhere. So I looked into Clonezilla again to see if there was an option that would help, and tried the Expert settings with the -rescue option, but the same thing happened, and I knew I was in trouble. Anyone who has installed Gentoo by hand can tell you they would rather not do it again.
After looking online I knew I would have to use dd to copy everything block by block and then run fsck on the result, but I found out that Clonezilla ships with dd_rescue, which is like dd except that it does not stop on errors and does not truncate the output file. The command used was:
dd_rescue /dev/sda1 /dev/sdb1
After dd_rescue ran for about 4 hours it came back with 32 errors, totaling about 16 KB (32 unreadable 512-byte sectors); not bad for some bad sectors on the disk! This saved all the data that was on the drive, saved me from reinstalling Gentoo on the system, and also prompted me to go ask John to do a How To on backing up.
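For completeness, here is the fsck step mentioned above but never shown; a minimal sketch, assuming the same destination partition as the dd_rescue run (it must not be mounted when you check it):

```shell
# Check and repair the copied filesystem on the destination drive.
# /dev/sdb1 is the destination partition from the dd_rescue command above.
fsck -y /dev/sdb1
```

The `-y` answers yes to every repair prompt, which is what you want after a block-level copy from a dying disk, since there may be many orphaned inodes to clean up.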
After playing with WordPress media and uploading a bunch of images to the website, I found myself asking why WordPress has a 2 MB upload limit. Well, it turns out the limit is not WordPress’s fault; the problem actually lies in your php.ini. Depending on what kind of host you have, you can ask them to raise the limit for you, or if you have your own dedicated server you can change it yourself.
In order to find the correct configuration file, make a PHP file inside your website containing the line below, then load it in your browser and look for the “Loaded Configuration File” entry.
<?php phpinfo(); ?>
After we find the correct configuration file, open it with your favorite text editor. I like to use nano; it’s what I started with and I know its ins and outs. We are looking for the File Uploads section inside php.ini.
;;;;;;;;;;;;;;;;
; File Uploads ;
;;;;;;;;;;;;;;;;

; Whether to allow HTTP file uploads.
file_uploads = On

; Temporary directory for HTTP uploaded files (will use system default if not
; specified).
;upload_tmp_dir =

; Maximum allowed size for uploaded files.
upload_max_filesize = 2M
The last line here is what we need to change; you can set it to whatever value you want. I increased mine by 300% (to 8M), which lets me upload ZIP files past the old 2 MB limit.
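One caveat worth adding: php.ini’s post_max_size directive caps the total size of a POST request, so if it is smaller than upload_max_filesize, uploads will still be rejected. Assuming the 8M figure above, the pair of settings would look like:

```ini
; Maximum allowed size for uploaded files.
upload_max_filesize = 8M

; Must be at least as large as upload_max_filesize,
; or large uploads will still fail.
post_max_size = 8M
```

Remember to restart or reload your web server (or PHP-FPM) after editing php.ini so the new values take effect.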
We are finally up and slowly moving along!