Category: Cloud

GeekNews: A cloud-native app template


GN is a cloud-native app template (seed) masquerading as a simple social news site and an homage to HackerNews. GN includes:

  • python3+django1.7 web app foundation
  • basic templates for class-based views and user interactions (see the sketch below the list)
  • user registration process incl. TOS acceptance step
  • user profiles & account maintenance
  • integrated local & remote devops workflow
  • ready to scale to real world needs out of the box*
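
To give a flavor of the class-based-view style GN leans on, here’s a minimal sketch - Post, PostListView, and the template path are illustrative stand-ins, not GN’s actual code (see the repo for that):

    # Hypothetical sketch of a Django 1.7 class-based view in the GN style.
    from django.views.generic import ListView

    from .models import Post  # assumed model for submitted stories

    class PostListView(ListView):
        """Front page: newest submissions first, paginated."""
        model = Post
        template_name = 'posts/post_list.html'
        paginate_by = 25

        def get_queryset(self):
            # typical social-news ordering; swap in your own ranking later
            return Post.objects.order_by('-created_at')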

Get it now on Github: https://github.com/thomaswilley/geeknews

Why?

I wanted a simple seed app I could clone for future projects, letting me focus much more rapidly on the differentiated, high-value parts of each project and spend much less time on the undifferentiated parts.

Maybe many of us are looking for something similar, so today I’m releasing GN as an example and a learning tool. While it’s probably unsuitable for production use as-is, it’s hopefully a helpful start that adds value for you right away.

Which primary technologies are involved?

Python 3 and Django 1.7 on the app side, plus a Git-driven devops workflow, with CloudFoundry handling deployment and scaling.

Why this stack?

Generally, these are the tools and devops flows I personally work with the most - the habits left over after years of experimenting. The one exception is CloudFoundry, which is relatively new for me - I used CF here because I think I’m officially at the point where I don’t want to think about servers anymore unless I really have to (continuing the mission to focus on just the differentiated parts!). Of my live projects, the one running on a PaaS (Google AppEngine in this case) is by far the most productive to maintain. While Azure and AppEngine are wonderful platforms in their own rights, I tend to prefer CF because it’s entirely open, and I believe it’s the future of enterprise app development for many of the reasons that led to GN’s creation.

If you’d like to deploy GN on IaaS or another PaaS, it should of course be straightforward - and if you want to script your IaaS deploy and would like some examples of those devops flows, you may find some value in my Vagrant/Ansible-based clustering template for CoreOS/Fleet.

What can I do with it?

You can clone it and be in production in 5 minutes with your own social news site, ready for refactoring into whatever you’re looking to build; and you’re ready to support significant traffic loads (anecdotally limited only by how much you want to pay, not by the technology). You’ll have a familiar workflow already set up, along with templates to edit in place, and can start by focusing on your data model and core value proposition rather than worrying about boilerplate… and then, when you have what you need at the core, everything else is fully open and accessible for you to create, extend, edit, and tune as needed.

*Scaling is possible through out-of-the-box compatibility with CloudFoundry, and Pivotal WS in particular (disclaimer: I work for EMC, Pivotal’s parent company - however, all views expressed here are my own).

Quick scripted local CoreOS clusters

CoreOS is a really interesting project. It basically takes the best ingredients from ChromeOS, like OS-level auto-updating, melts the rest down to just the essentials, and adds a few critical goodies - all in all making a clean, modern linux distribution that should cluster extremely well. This is a quick ansible / vagrant script that spins up a local cluster so you can play around with fleet in particular.

https://github.com/thomaswilley/coreos-cluster

Quick & dirty deployments on EC2 with Python, Django, Pip, and Git

As I was working on our wedding website, I quickly realized my fiancé and others would need to view it as it went up. And make changes… lots of them.

One-click (one-key) deploys are becoming something of a religion for me with my hobby projects, so why should our wedding site be any different… I decided to start from complete scratch, put an Ubuntu 10.10 server on EC2 to the task of hosting the site with a one-key deploy using Git, document the steps, and share them in case it helps you in your efforts. Sure, I could’ve used localtunnel or something like it, but where’s the fun in that? I want to try this app live on the same server I plan to deploy its beta on, and learn something along the way.

Disclaimer: There are lots of great ways to automate deployments, especially with Python and OSS. Among the most notable - perhaps the most notable - is Fabric, which is built for just this challenge. I don’t know Fabric well enough to make a simple tutorial of it, but I do know Git, and I wanted something really simple.

Assumptions: Your app is ready to be deployed and works locally on the django development server. You know the basics of POSIX, AWS & EC2, Python, Django, and Git, you’re running a reasonably current Mac or POSIX OS, and you’ve done at least some of this before.

As a simplifying convention, I’ll use “LOCAL MACHINE” followed by steps to be done on your local dev machine, and “REMOTE MACHINE” followed by steps to be done on your remote EC2 instance. In case it’s not clear, don’t type anything in <note:> brackets.

Let’s begin…

LOCAL MACHINE

  • Launch a new server instance, in this example: Ubuntu 10.10 via AMI: ami-ccf405a5
  • Enable SSH, HTTP, HTTPS (and anything else you want)
  • Create & save (or re-use) your key locally so we can refer to it as <aws_key>, typically on local path ~/.ssh/<aws_key>

Note: Elastic IPs are nice to use here. They just make life easier for a number of reasons, particularly if you’re going to put this on a domain later, where it can be handy depending on which DNS route you choose. In our case we’ll use one so our local hosts file (POSIX) can point reliably to a named server and shorten shell incantations…

LOCAL MACHINE

  • Edit your hosts file (on path /etc/hosts) to append a line (replacing the <elastic-ip.to.remote.server> part with the actual IP):

    <elastic-ip.to.remote.server> ec2server

  • Login to the remote machine, “ec2server”

    $ ssh -i ~/.ssh/<aws_key> ubuntu@ec2server

Now it’s time to get some of the basics out of the way and install git…

REMOTE MACHINE

$ sudo apt-get update && sudo apt-get upgrade
$ sudo dpkg-reconfigure tzdata <note: to set timezone>
$ sudo apt-get install git-core git

Now that the server has the absolute minimum for this exercise we can create our git repository and keep pushing forward. Note here that if you plan to use this for more public facing deployments it’s best to secure the machine at this stage.

$ cd ~/ && mkdir productionsite.git && cd productionsite.git
$ git init --bare

Now that the repo is initialized, we’ll be able to add it as a remote on our local dev machine. The obvious thing to note here is that just publishing code to the remote Git repo does nothing to deploy it. For the deployment steps, we’ll use a post-receive hook on the repo. In case it’s new information for you: Git ships with a handful of hooks, which are simply shell scripts that Git executes when certain actions occur. By creating or editing these scripts (hooks) you can do all kinds of neat things as you take action with the system. For our purposes, we simply want to check out all the code from the repo into a directory on the remote machine, where we can then run a django development server on it. All of this can be automated, but for now, let’s just check out the code and run the development server manually.

Continuing on, make a directory to hold the checked-out code from the repo:

$ cd /home/ubuntu
$ mkdir web

Now that we have our destination for checked-out code at /home/ubuntu/web, let’s make a quick post-receive hook that does the actions described above. It’s 2 lines…

$ cd /home/ubuntu/productionsite.git
$ cat > hooks/post-receive
#!/bin/sh
GIT_WORK_TREE=/home/ubuntu/web git checkout -f
<note: ctrl+d finishes this command>

$ chmod +x hooks/post-receive

That is all there is to it - pretty straightforward, right? Now it’s time to go back to your local dev machine, connect your local repo to the new remote, and test everything out. At this point, I’m assuming you have a local project already set up somewhere, possibly already remoted to GitHub or something - at least with Git already initialized.

LOCAL MACHINE

$ cd <path-to-local-project-directory>
$ ssh-add ~/.ssh/<aws_key>
$ git remote add ec2 ssh://ubuntu@ec2server/home/ubuntu/productionsite.git

Now your remote is named “ec2” and points to the remote machine named “ec2server”. 

On your first push, you need to set up the repo with the right head information (the refspec maps your local master onto the remote’s master, and the leading + allows that initial update), so the first push should be:

$ git push ec2 +master:refs/heads/master

For subsequent pushes, you can simply use:

$ git push ec2

Note that after your first push, your code is published to your remote machine and should be checked out into the /home/ubuntu/web directory we set up for it. So let’s log in and validate that the push and post-receive hook work as expected.

REMOTE MACHINE

$ cd /home/ubuntu/web && ls <note: you should see your code checked out here>

If your code isn’t checked out, a good first place to double-check is back in the shell on your local machine: Git prints any error messages it received from your ec2 remote repo in the local shell as part of the push output.

Now you’re done with nearly the simplest possible code deployment setup. You can deploy in one command, but you still need django and all the key software components of your app installed on the server. The reason I install the python dev tools at this step instead of at the beginning is to reinforce the fact that this is a loosely coupled exercise. You really can stay quick & dirty with Git remotes, and use any other tools from there on out.

From here forward, I’m assuming you used distribute and ran a “$ pip freeze > requirements.txt” to capture all the packages you need. I’ll also assume that requirements.txt is in your django project folder and was pushed to your remote server successfully as a result of the previous steps in this exercise. If it wasn’t, run the following on your local machine in your django project directory:

$ pip freeze > requirements.txt
$ git add requirements.txt && git commit -am 'adding requirements.txt' && git push ec2

Let’s move on and install the dependencies on our remote system.

REMOTE MACHINE

$ sudo apt-get install python-pip python-dev build-essential
$ sudo pip install --upgrade pip
$ pip freeze <note: to see what’s already there and validate the toolchain’s working>
$ sudo pip install -r /home/ubuntu/web/<rest-of-path-to-django-proj-dir>/requirements.txt

That’s it - you should have your app’s environment installed. We didn’t use a virtualenv on the server because we’re being lazy in this tutorial, and it could reasonably be argued there’s no need for one when you have a dedicated server for your app. Either way, now it’s time to make sure your app works.

At this point you might have some settings in your settings.py file that are wired to directories on your development machine, or other nuances. It’s a good time to open settings.py and make sure you’re ready for the remote machine. Sometimes the easiest way to do this is to start the development webserver and take a look at your app - the same command you can use when you’re all done and ready to show off your work as it’s being built!

$ cd /home/ubuntu/web <note: assuming this is where your manage.py is>
$ sudo python manage.py runserver 0.0.0.0:80 <note: to run the django development server on port 80>
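
If the dev server complains, it’s usually a handful of settings still wired to dev-machine paths. Here’s a hypothetical sketch of the usual suspects for a Django project of this era - the names and layout are illustrative, not taken from any particular project:

    # settings.py - hypothetical excerpt; derive paths from the file's own
    # location so the same settings work locally and on the EC2 box.
    import os
    PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))

    DEBUG = True  # fine while previewing; turn off before real traffic

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            # hardcoded absolute dev-machine paths are the usual culprit
            'NAME': os.path.join(PROJECT_ROOT, 'dev.db'),
        }
    }

    MEDIA_ROOT = os.path.join(PROJECT_ROOT, 'media')
    TEMPLATE_DIRS = (os.path.join(PROJECT_ROOT, 'templates'),)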

All done! Hope that’s helpful, feedback welcome.

Setting up EC2 w/django, python, mysql, apache and a new user (draft in progress)

set up an image. i used ubuntu with AMI ID ami-508c7839. after install i realized there’s a newer version; this is ok for now.

key pair - i made my own then downloaded the key (automatically). let’s call it “aws_admin.pem” for now.

security group - i created one and added https (443), http (80), and ssh (22) privs for “All Internet”. for now, assume i called it “public_internet”

secure the local key. first i manually copied it from downloads into ~/.ssh/, then run:

$ chmod 600 aws_admin.pem

then log in via ssh; note the “ubuntu@” instead of “root@”, as the Ubuntu AMI won’t have a root user you can use.

$ ssh -i aws_admin.pem ubuntu@ec2-184-73-127-59.compute-1.amazonaws.com

then update apt-get (the ‘-y’ just defaults to yes for all prompts)

$ sudo apt-get -y update && sudo apt-get -y upgrade

awesome guide for the rest of it - especially the setup of a new user: http://www.philroche.net/archives/simple-django-install-on-amazon-ec2/

additional info:

to start apache once you’ve installed it, use the following command (note: and NOT simply “$ apache2 -k start”)

sudo /etc/init.d/apache2 restart

Next I wanted to setup TextMate to access remote files for editing to make life easier and found this howto using MacFUSE: http://minimaldesign.net/articles/read/remote-textmate-projects

django-nonrel on appengine & setup for file transfers for django newbs (like me)

I read the django-nonrel article recently posted on code.google and wanted to give django a whirl more officially - beyond the basic views & templates support GAE offers out of the box. I had a little hobby app that was small and simple enough to be my test case for making the cutover.

One of the features I needed to get working quickly was file uploads, and after looking through the docs I realized the django-filetransfers package was just the ticket, so I looked at the example source the team posted. (Kudos to the django-nonrel team - the docs were easy to navigate.)

I realized as I went through the build that there were a couple of steps where I needed to google around, so I figured I’d take notes here in case you’re trying something similar - it might save you some time.

On python2.5 (same as my AppEngine SDK), assuming you already have django-nonrel installed.

  1. Download django-filetransfers
  2. $ sudo python2.5 setup.py install (note: I did this as a matter of course; step 5 is what mattered for me)
  3. copy subdirectory “filetransfers” into my project directory (note: not the app directory)
  4. update the settings.py file for my project to include ‘filetransfers’ inside INSTALLED_APPS
  5. get the rest of your files in place (e.g., upload.html, views, etc.) - some random notes around package imports below, plus a view sketch after the list:
    1. from filetransfers.api import prepare_upload, serve_file
    2. from django.shortcuts import get_object_or_404
    3. from django.core.urlresolvers import reverse (note: important to get the view handlers right for the sample!)
    4. update the view_url’s reverse() parameter to point at your app’s handler (e.g., myApp.views.upload_handler where myApp’s your app’s name)
  6. patch your local SDK as explained here (note: in v1.4 of the GAE SDK it was on line 302)
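
For reference, here is roughly the shape the sample app’s views take once those imports are in place - UploadModel, UploadForm, and myApp are stand-ins for your own names, and this is a sketch modeled on the django-filetransfers sample rather than a drop-in implementation:

    # views.py - hypothetical sketch based on the django-filetransfers sample app
    from django.core.urlresolvers import reverse
    from django.http import HttpResponseRedirect
    from django.shortcuts import render_to_response, get_object_or_404
    from filetransfers.api import prepare_upload, serve_file

    from myApp.models import UploadModel  # stand-in for your model
    from myApp.forms import UploadForm    # stand-in for your ModelForm

    def upload_handler(request):
        view_url = reverse('myApp.views.upload_handler')
        if request.method == 'POST':
            form = UploadForm(request.POST, request.FILES)
            if form.is_valid():
                form.save()
            return HttpResponseRedirect(view_url)
        # prepare_upload returns the POST target plus any extra hidden
        # fields the storage backend (e.g., the Blobstore) needs in the form
        upload_url, upload_data = prepare_upload(request, view_url)
        form = UploadForm()
        return render_to_response('upload.html',
            {'form': form, 'upload_url': upload_url, 'upload_data': upload_data})

    def download_handler(request, pk):
        upload = get_object_or_404(UploadModel, pk=pk)
        return serve_file(request, upload.file, save_as=True)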

That should do it.