Category: "Other"

Ansible SSH issues under CentOS

I needed a CentOS 6.5 guest under a CentOS 6.6 host for development and to prepare for deployments.

I could not get Ansible to SSH into the box.

I took the provisioning out of the Vagrantfile and began running the playbook on the command line with:

ansible-playbook main.yml -i inventories/vagrant -u vagrant -vvvv -k


While debugging, I worked through the following:

  • Offending key in /home/user/.ssh/known_hosts - removed the offending key
  • sshpass not installed - installed sshpass
  • SSHed in directly as the user (vagrant) that will be provisioning
  • Set the gui flag to true in the Vagrantfile - this made it easier to see what was happening
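
For reference, a minimal static inventory for a Vagrant box looks something like this (the group name, address, port, and key path are assumptions based on Vagrant defaults, not taken from my actual file):

    [vagrant]
    default ansible_ssh_host= ansible_ssh_port=2222 ansible_ssh_user=vagrant

With -k, Ansible prompts for the SSH password (vagrant, by default) instead of using a key, which is why sshpass has to be installed.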

git "Permission denied (publickey,keyboard-interactive)."

If you are getting permission denied when working with git on a remote server using a key, this may help.

First, test that the key will be accepted by the remote server.

ssh -v

Look for these lines in the output:

debug1: Next authentication method: publickey
debug1: Offering public key: /home/account/.ssh/example.key
debug1: Server accepts key: pkalg ssh-rsa blen 277
debug1: Authentication succeeded (publickey).

Then check your ~/.ssh/config file. Be sure the user is in the file and matches what worked with the ssh test.


    User git
    IdentityFile ~/.ssh/example.key

Now get back to work.


Amazon S3 Backup

This is the third tier of a backup system, the last resort if everything has been destroyed or corrupted. This script can run on a local machine or elsewhere. I chose to run it locally because the credentials are not on a publicly accessible server. The local machine copies the data from the publicly accessible servers, stores it, then sends it to S3.

The first step is to sign up at Amazon for an S3 account, create a bucket, and create a user. Limit the user's privileges as much as possible; for this script, the user needs only the putObject privilege.
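
A minimal IAM policy for such a user might look like the following (the bucket name is a placeholder; in IAM policy terms the privilege is spelled s3:PutObject):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::example-backup-bucket/*"
        }
      ]
    }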

The script is written in Ruby. It reads a JSON configuration file which contains all the servers, files, and databases to be backed up.

JSON file syntax:

{
    "email": "user@localhost",
    "servers": {
        "": {
            "login": "username",
            "password": "password",
            "databases": [ { "name": "database_name", "dbuser": "user", "dbpass": "password" } ],
            "files": ["backup.tgz"]
        }
    },
    "s3": {
        "bucket": "",
        "username": "user",
        "accesskeyid": "-- S3 Access Key Id --",
        "secretaccesskey": "-- S3 Secret Access Key --"
    }
}

Each server can include multiple databases and files. Be sure to limit the privileges for this database user to SELECT and LOCK TABLES, which makes it effectively read-only, and be sure to grant the backup server remote access to the database.
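
Assuming MySQL, the grant might look like this (host and password are placeholders):

    -- Read-only backup account, reachable only from the backup machine
    GRANT SELECT, LOCK TABLES ON database_name.* TO 'user'@'backup.host.example' IDENTIFIED BY 'password';
    FLUSH PRIVILEGES;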

The files are to be placed in a directory where they can be retrieved with wget - in the example above, backup.tgz would be fetched from the web root of the server. The intent is that these files contain content that is already publicly available. This is NOT a place to put application configuration settings.

Each server will have a hierarchy like this:

|-- initial.tgz
`-- 20140101093022
    |-- backup.tgz
    `-- database_name.sql.tgz

Create initial.tgz manually - run the tar command at the top of the account, download it to your local machine, then upload it to S3. If you want to get it to S3 from the server, that's fine, just be careful not to ever leave your S3 credentials on the source server.
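
A sketch of creating initial.tgz, with placeholder paths (point the account directory at the real home directory of the account being archived):

```shell
# Demo with a placeholder account directory
ACCOUNT_HOME=/tmp/account-demo
mkdir -p "$ACCOUNT_HOME"
echo "example" > "$ACCOUNT_HOME/readme.txt"
# Run the tar at the top of the account
tar czf /tmp/initial.tgz -C "$ACCOUNT_HOME" .
ls -l /tmp/initial.tgz
```

Then scp the archive down to the local machine and upload it to S3 from there.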

This is the backup script. It uses wget to get the files (it can use scp instead when a login is configured, but then you may have a credential issue), and dumps the databases.

#!/usr/bin/env ruby

require 'json'
require 'net/smtp'
require 'rubygems'
require 'aws-sdk'
require 'net/ssh'
require 'net/scp'

# Holds the outcome of one backup step: what was fetched, the exit
# status of the command that produced it, and an ls -l of the result.
class ItemStatus
	def initialize(item_name, exit_status, ls_file)
		@item_name, @exit_status, @ls_file = item_name, exit_status, ls_file

	def name

	def error
		@exit_status != 0

	def to_s
		"#{@item_name}\t#{@exit_status}\t#{@ls_file}"

json ='config/.json')
parms = JSON.load(json)

if parms["email"].nil? || parms["email"].empty?
	to_email = "user@localhost"
	to_email = parms["email"]

s3 =
  :access_key_id => parms['s3']['accesskeyid'],
  :secret_access_key => parms['s3']['secretaccesskey']

backup_dir = "servers"
bucket = s3.buckets[parms['s3']['bucket']]

parms["servers"].each_pair {|server_name, server|
	puts "Server: #{server_name}"
	backup = []
	dir_created = false
	dir = nil
	if !server.empty?
		date = `date "+%Y%m%d%H%M"|tr -d "\n"`
		dir = backup_dir + "/" + server_name + "/" + date
		`mkdir -p "#{dir}"`
		if $?.exitstatus == 0
			dir_created = true
			files = server["files"]
			if !files.nil? && files.length > 0
				if !server["login"].nil? && !server["password"].nil?
					# A login is configured - copy the files down over scp
					Net::SSH.start(server_name, server["login"], :password => server["password"]) do |ssh|
						files.each {|file_name|
							dir_file_name = "#{dir}/#{file_name}"!(file_name, dir_file_name)
							backup.push(, $?.exitstatus, `ls -l "#{dir_file_name}"`))
					# No login - fall back to wget
					files.each {|file_name|
						dir_file_name = "#{dir}/#{file_name}"
						`wget -q "http://#{server_name}/#{file_name}" -O "#{dir_file_name}"`
						backup.push(, $?.exitstatus, `ls -l "#{dir_file_name}"`))
			databases = server["databases"]
			if !databases.nil? && databases.length > 0
				databases.each {|db|
					dbvalues = db.values_at("name", "dbuser", "dbpass").delete_if {|v| v.nil? || v.empty?}
					if dbvalues.length == 3
						dir_file_name = "#{dir}/#{db["name"]}.sql"
						`mysqldump -C #{db["name"]} -u"#{db["dbuser"]}" -p"#{db["dbpass"]}" -h"#{server_name}" > "#{dir_file_name}"`
						backup.push("#{db["name"]}.sql", $?.exitstatus, `ls -l "#{dir_file_name}"`))
						tar_file_name = dir_file_name + ".tgz"
						`tar czf "#{tar_file_name}" "#{dir_file_name}"`
						backup.push(, $?.exitstatus, `ls -l "#{tar_file_name}"`))
			# Send everything collected for this run to S3
			Dir.glob("#{dir}/*").each {|path|
				bucket.objects[path].write(:file => path)

	error ={|item| item.error}
	if error.length == 0
		# Everything succeeded - prune local copies more than a week old
		`find "#{backup_dir}/#{server_name}" -mindepth 1 -mtime +8 | xargs --no-run-if-empty rm -rf`

	msg = <<MESSAGE_END
From: Backup <amazon@localhost>
To: Me <#{to_email}>
Subject: #{server_name} backup status

	if !server.empty?
		if dir_created
			msg = msg + "Created #{dir} okay\n\n"
			if backup.length > 0
				msg = msg + "Files\n"
				backup.each {|v|
					msg = msg + "\t" + v.to_s + "\n"
				msg = msg + "\nColumns\n\t1. Source\n\t2. Exit Status\n\t3. File Information\n"
			msg = msg + "mkdir #{dir} failed"
		msg = msg + "No backup configuration"
	msg = msg + "\n\n\n"
		Net::SMTP.start('localhost', 25) do |smtp|
			smtp.send_message msg, 'amazon@localhost', to_email
		puts "Mail send failed"

Finally, create a cron job to run the script as needed.
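
A crontab entry along these lines does it (the path and schedule here are placeholders - the script expects to find its config/ directory relative to where it runs):

    15 3 * * * cd /home/user/backup && ./backup.rb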

It is assumed that version control for the code is handled elsewhere. This backup is for data, with an emergency copy of the code; if the code changes, that emergency copy must be updated manually.

A note about leaving the password in the config file. I understand it is a security issue. That's why this is running on a local machine. Is it completely secure? No. But it isn't on a publicly accessible server either. Could I spend more time making it secure? Absolutely. Am I going to? Probably not.

Sifting Through Spam

If you are setting up email filters for an account, some useful tactics are:

Display the headers of the emails in the account:

grep -iE "^(subject|from|reply-to|X-Spam-Level):" *

Once you identify messages of interest, you can use more to view them.

If you're using cPanel's filter interface with a regex, you can exclude all .eu and .us (and any other) TLDs with a pattern that matches those endings. I put it on both the From and Reply-To headers.
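
As a sketch of the idea in Ruby (the TLD list here is an example, not my actual filter):

```ruby
# Example only: flag addresses whose domain ends in one of these TLDs
BLOCKED_TLDS = /\.(eu|us)>?\s*$/i

# Returns true when a From / Reply-To value ends in a blocked TLD
def blocked?(header_value)
  !(header_value =~ BLOCKED_TLDS).nil?
end

puts blocked?("Spammer <spam@example.eu>")   # true
puts blocked?("Friend <friend@example.com>") # false
```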

Another good rule is to match on the spam score in the X-Spam-Status header.
Emails with a spam score of 3 or 4 are rejected with a message that the sender should use the contact form on the site. Almost all of these will be spam, but for the few that aren't, the sender will have a way to resubmit their message.
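
As an illustration of the idea in Ruby (SpamAssassin writes the score into the header as score=N.N, so a score of 3 or 4 can be matched on the integer part):

```ruby
# Match a SpamAssassin score of 3.x or 4.x in the X-Spam-Status header
MODERATE_SCORE = /score=[34]\./

header = "X-Spam-Status: No, score=3.7 required=5.0 tests=BAYES_50"
puts(header =~ MODERATE_SCORE ? "reject with resubmit notice" : "pass through")
```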

Be sure to test to make sure it works the way you want it to.

If you find domains that are clearly just spammers, block them explicitly.

Report spam to the appropriate reporting addresses, and scams to the organization that's being misrepresented.

CentOS 6.4 VirtualBox with Windows 7 (64-bit) Guest

I have an ASUS laptop with a factory installed version of Windows 7 (64-bit) on the internal hard drive, and CentOS 6.4 running off an external USB drive.

My goal was to use Windows 7 to host as many browsers as possible for testing. For that reason, I needed to be able to have both Windows and CentOS running at the same time.

I used the following commands to map the Windows drive for use with VirtualBox:

As root:

chmod a+rw /dev/sda

As the regular user:

VBoxManage internalcommands createrawvmdk -filename /opt/vbox.disks/windows7.vmdk -rawdisk /dev/sda
VBoxManage storagectl 'Windows 7' --name 'SATA' --hostiocache on

Then I created the virtual machine and assigned it to use the VMDK and booted it up.
