Student Web Hosting for High Schools

  • What is it? It's a server set up to give high school students a chance to work on the same technology, in the same environment, as in the web development industry. Students can build web sites and the whole world can see them.
  • Why would you do that? It's real. Offering it as a service allows schools to focus on what they do best - teaching.
  • So what exactly is the service? The service is specialized, value-added Linux hosting. The specialization is that the people running the server are prepared to support students who may need more help than a typical web hosting service provides. The value added is low cost, simple administration, and ease of use for everyone.
  • What courses would benefit from web hosting? Most web development courses will benefit from a web hosting program. If you are teaching HTML, CSS, JavaScript, PHP, or MySQL, having a server available reduces the time students spend setting up a development environment. If you are teaching Linux, a server is a great resource because there is no GUI - students must work on the command line.
  • Can you teach web development without web hosting? Absolutely. Students can run the software on their own computers, on school computers, or purchase their own hosting. To learn HTML, CSS, and JavaScript, students need only a text editor and a browser.
  • How does it benefit the students? Again, it is real. Students will be working in the same environment they would encounter in the workforce. It also levels the playing field by providing access to these resources for all students, not just those who can afford them.
  • What if a student needs help? There is 24/7 Skype-based chat support available.
  • What about security? The hosting company in use is a world-class service provider. In addition to their security practices, the student server is monitored for ANY content that should not be there, including objectionable content, illegal activity, and distribution of content that should not be present. Any account that violates the rules will be suspended.
  • What type of support is there for teachers? Schools will be provided with a 'cheat sheet' which describes how students can put their work on the server and access it. The materials will also be available on the web.
  • Can teachers monitor student accounts? Upon request, teachers can be granted access to the server so they can see how students are using the accounts.
  • What protection is in place to prevent students from copying work from each other? There are methods to hide content and restrict access.
  • Is it expensive? No. Pricing is based on the number of student accounts and the level of support.
  • Do I need a domain name? No. There are several different ways you can access the service; we will work with you to fit it to your needs.
  • What about backups? A three tiered backup is in place - at the hosting company, on a local server, and on a remote server.
  • Can they run applications? Application support varies.
  • Why is this a blog post and not a web page, subdomain, or landing page? Because I don't have time to make pretty web pages. I have time to run servers and help people.
  • Sold! Sign me up! Contact me through email - studenthosting -@- wirehopper com.
  • Do you support Microsoft or Windows? No. Microsoft is a great company and Windows is a wonderful product, but this service is Linux-based.

eZ Publish - Object Relation Attributes - Reckless Cleanup


This is NOT A FIX for the code, but if you are getting confusing query results due to relationships with object relation attributes that were deleted from content classes, you can use this query:

Back up your database first

This is not for the faint of heart

There are no warranties or other guarantees - use this at your own risk.

DELETE FROM ezcontentobject_link WHERE contentclassattribute_id != 0 AND NOT EXISTS (SELECT * FROM ezcontentclass_attribute WHERE = ezcontentobject_link.contentclassattribute_id);
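Before deleting anything, you can run the same criteria as a SELECT to see how many rows would be removed:

```sql
SELECT COUNT(*) FROM ezcontentobject_link
WHERE contentclassattribute_id != 0
  AND NOT EXISTS (SELECT * FROM ezcontentclass_attribute
                  WHERE = ezcontentobject_link.contentclassattribute_id);
```

If the count looks wrong, stop and investigate before running the DELETE.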

Twitter Application Auth Sample - PHP


This is sample PHP code which can be used to get a Twitter OAuth token for use in making API calls.

It includes a trends/available request that gets the list of countries for which Twitter trends are available.

Be sure to read the documentation at the link above. A given application can only have one token at any given time, so once established, the token should be stored and reused.


        $consumerKey = '-- YOUR CONSUMER KEY --';
        $consumerSecret = '-- YOUR CONSUMER SECRET --';
        $encodedKey = urlencode($consumerKey);
        $encodedSecret = urlencode($consumerSecret);
        $bearerTokenCredentials = $encodedKey.':'.$encodedSecret;
        $base64BearerTokenCredentials = base64_encode($bearerTokenCredentials);

        // Exchange the consumer credentials for a bearer token
        $ch = curl_init('https://api.twitter.com/oauth2/token');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, 'grant_type=client_credentials');
        curl_setopt($ch, CURLOPT_HTTPHEADER,
                array('Content-Type: application/x-www-form-urlencoded;charset=UTF-8',
                        'Authorization: Basic '.$base64BearerTokenCredentials));
        $result = curl_exec($ch);
        $error = curl_errno($ch);
        if ($error === 0) {
                $json = json_decode($result);
                // Use the bearer token for the trends/available request
                curl_setopt($ch, CURLOPT_URL, 'https://api.twitter.com/1.1/trends/available.json');
                curl_setopt($ch, CURLOPT_HTTPGET, true);
                curl_setopt($ch, CURLOPT_HTTPHEADER,
                        array('Authorization: Bearer '.$json->access_token));
                $result = curl_exec($ch);
                $error = curl_errno($ch);
                if ($error === 0) {
                        $json = json_decode($result);
                        $countries = array();
                        foreach ($json as $location) {
                                if ($location->placeType->name == 'Country') {
                                        $countries[$location->woeid] = $location->name;
                                }
                        }
                }
        }
        curl_close($ch);

git "Permission denied (publickey,keyboard-interactive)."

If you are getting permission denied when working with git on a remote server using a key, this may help.

First, test that the key will be accepted by the remote server.

ssh -v

Look for these lines in the output:

debug1: Next authentication method: publickey
debug1: Offering public key: /home/account/.ssh/example.key
debug1: Server accepts key: pkalg ssh-rsa blen 277
debug1: Authentication succeeded (publickey).

Then check your ~/.ssh/config file. Be sure the user is in the file and matches what worked with the ssh test.



    User git
    IdentityFile ~/.ssh/example.key
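A complete Host entry looks like this (the host name is a placeholder for your git server):

```
Host git.example.com
    User git
    IdentityFile ~/.ssh/example.key
```

The Host value must match the host name you use in the git remote URL.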

Now get back to work.


Amazon S3 Backup


This is the third tier of a backup system, the last resort if everything has been destroyed or corrupted. This script can run on a local machine or elsewhere. I chose to run it locally because the credentials are not on a publicly accessible server. The local machine copies the data from the publicly accessible servers, stores it, then sends it to S3.

The first step is to sign up at Amazon for an S3 account, create a bucket and a user. Limit the privileges for the user as much as possible; for this script, the user needs only the putObject privilege.
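A minimal IAM policy for such a user might look like this (the bucket name is a placeholder):

```
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::your-backup-bucket/*"
  }]
}
```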

The script is written in Ruby. It reads a JSON configuration file which contains all the servers, files, and databases to be backed up.

JSON file syntax:

{
    "email": "user@localhost",
    "servers": {
        "": {
            "databases": [ { "name": "database_name", "dbuser": "user", "dbpass": "password" } ],
            "files": [ "backup.tgz" ]
        }
    },
    "s3": {
        "bucket": "",
        "username": "user",
        "accesskeyid": "-- S3 Access Key Id --",
        "secretaccesskey": "-- S3 Secret Access Key --"
    }
}
Each server can include multiple databases and files. Be sure to limit the privileges for this database user to SELECT and LOCK TABLES, which makes it effectively read-only. Be sure to grant remote access to the database for the backup server.
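The grants described above might look like this in MySQL (user, host, database name, and password are placeholders):

```sql
GRANT SELECT, LOCK TABLES ON database_name.* TO 'user'@'backup.host.example' IDENTIFIED BY 'password';
FLUSH PRIVILEGES;
```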

The files should be placed in a directory where they can be retrieved with wget - in the example above, the script fetches http://<server name>/backup.tgz. These files are intended to contain only content that is already publicly available. This is NOT a place to put application configuration settings.

Each server will have a hierarchy like this:

|-- initial.tgz
`-- 20140101093022
    |-- backup.tgz
    `-- database_name.sql.tgz

Create initial.tgz manually - run the tar command at the top of the account, download it to your local machine, then upload it to S3. If you want to get it to S3 from the server, that's fine, just be careful not to ever leave your S3 credentials on the source server.
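The manual step can be sketched like this (the account directory is a placeholder; a temporary directory stands in for it here):

```shell
# Build initial.tgz from the top of the account directory
ACCOUNT=$(mktemp -d)
ARCHIVE="$ACCOUNT.initial.tgz"
echo '<html></html>' > "$ACCOUNT/index.html"
tar czf "$ARCHIVE" -C "$ACCOUNT" .
# List the contents to verify the archive
tar tzf "$ARCHIVE"
```

Then download the archive (scp or similar) and upload it to S3 from your local machine.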

This is the backup script. It uses wget to get the files (you can use scp, but then you may have a credential issue), and dumps the database.


#!/usr/bin/env ruby
require 'json'
require 'net/smtp'
require 'rubygems'
require 'aws-sdk'

# Records the outcome of one download, dump, or tar step
class ItemStatus
  def initialize(item_name, exit_status, ls_file)
    @item_name, @exit_status, @ls_file = item_name, exit_status, ls_file
  end

  def name
    @item_name
  end

  def error
    @exit_status != 0
  end

  def to_s
    "#{@item_name}\t#{@exit_status}\t#{@ls_file}"
  end
end

# Load the JSON configuration described above
json = File.read('config/.json')
parms = JSON.load(json)
if parms["email"].nil? || parms["email"].empty?
  to_email = "user@localhost"
else
  to_email = parms["email"]
end

s3 = AWS::S3.new(
  :access_key_id => parms['s3']['accesskeyid'],
  :secret_access_key => parms['s3']['secretaccesskey'])
backup_dir = "servers"
bucket = s3.buckets[parms['s3']['bucket']]
backup = []

parms["servers"].each_pair {|server_name, server|
  puts "Server: #{server_name}"
  if !server.empty?
    date = `date "+%Y%m%d%H%M"|tr -d "\n"`
    dir = backup_dir + "/" + server_name + "/" + date
    mkdir = `mkdir -p "#{dir}"`
    if $?.exitstatus === 0
      dir_created = true
      # Fetch the publicly available files with wget
      if !server["files"].nil? && !server["files"].empty?
        server["files"].each {|file_name|
          dir_file_name = "#{dir}/#{file_name}"
          `wget -q http://"#{server_name}"/"#{file_name}" -O "#{dir_file_name}"`
          backup.push(ItemStatus.new(file_name, $?.exitstatus, `ls -l "#{dir_file_name}"`))
        }
      end
      # Dump and compress each database
      if !server["databases"].nil? && !server["databases"].empty?
        server["databases"].each {|db|
          dbvalues = db.values_at("name", "dbuser", "dbpass").delete_if {|v| v.nil? || v.empty?}
          if dbvalues.length === 3
            dir_file_name = "#{dir}/#{db["name"]}.sql"
            dump = `mysqldump -C #{db["name"]} -u"#{db["dbuser"]}" -p"#{db["dbpass"]}" -h"#{server_name}" > "#{dir_file_name}"`
            backup.push(ItemStatus.new(db["name"], $?.exitstatus, `ls -l "#{dir_file_name}"`))
            tar_file_name = dir_file_name + ".tgz"
            tar = `tar czf #{tar_file_name} #{dir_file_name}`
            backup.push(ItemStatus.new(tar_file_name, $?.exitstatus, `ls -l "#{tar_file_name}"`))
          end
        }
      end
    else
      dir_created = false
    end
  end

  error = backup.select {|item| item.error}
  if dir_created && error.length == 0
    # Send this run's files to S3 (the key layout here is an assumption),
    # then prune local copies older than eight days
    Dir.glob("#{dir}/*").each {|path|
      bucket.objects["#{server_name}/#{date}/#{File.basename(path)}"].write(:file => path)
    }
    `find -mindepth 1 -mtime +8 | xargs --no-run-if-empty rm -rf`
  end

  # Mail a status report for this server
  msg = <<END_OF_MESSAGE
To: Me <#{to_email}>
Subject: #{server_name} backup status

END_OF_MESSAGE
  if !server.empty?
    if dir_created
      msg = msg + "Created #{dir} okay\n\n"
      if backup.length > 0
        msg = msg + "Files\n"
        backup.each {|v|
          msg = msg + "\t" + v.to_s
        }
        msg = msg + "\nColumns\n\t1. Source\n\t2. Exit Status\n\t3. File Information\n"
      end
    else
      msg = msg + "mkdir #{dir} failed"
    end
  else
    msg = msg + "No backup configuration"
  end
  msg = msg + "\n\n\n"
  Net::SMTP.start('localhost', 25) do |smtp|
    smtp.send_message msg, 'amazon@localhost', to_email
  end
}

Finally, create a cron job to run the script as needed.
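For example, a nightly run could be scheduled like this (the time and script path are placeholders):

```
# m h dom mon dow  command
30 2 *   *   *    /usr/bin/env ruby /home/account/backup/backup.rb
```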

It is assumed that version control for the code is handled elsewhere. This backup is for data, with an emergency copy of the code; if the code changes, the archived copy must be updated manually.
