GeekNews: A cloud-native app template


GN is a cloud-native app template (seed) masquerading as a simple social news site and an homage to HackerNews. GN includes:

  • python3+django1.7 web app foundation
  • basic templates for class based views and user interactions
  • user registration process incl. TOS acceptance step
  • user profiles & account maintenance
  • integrated local & remote devops workflow
  • ready to scale to real world needs out of the box*

Get it now on Github: https://github.com/thomaswilley/geeknews

Why?

I wanted a simple seed app I could clone for future projects, so I could spend much more time on the differentiated, high-value parts of each project and much less on the undifferentiated parts.

Maybe many of us are looking for something similar, so today I'm releasing GN as an example and a learning tool. While it's probably unsuitable for production use as-is, it's hopefully a helpful start that adds value for you right away.

Which primary technologies are involved?

Python 3 and Django 1.7 for the app itself, and CloudFoundry (Pivotal WS in particular) for the devops and deployment side.

Why this stack?

Generally, these are the tools and devops flows I personally work with the most - the habits left over after years of experimenting. The one exception is CloudFoundry, which is relatively newer for me - I used CF here because I think I'm officially at that point where I don't want to think about servers anymore unless I really have to (continuing the mission to focus on just the differentiated parts!). Of the couple of live projects I run, the one on a PaaS (Google AppEngine in this case) is by far the most productive to maintain. While Azure and AppEngine are wonderful platforms in their own rights, I tend to prefer CF because it's entirely open, and I believe it's the future of enterprise app development for many of the reasons that led to GN's creation.

If you'd like to deploy GN on IaaS or another PaaS, it should of course be straightforward. And if you want to script your IaaS deploy and are looking for examples of those devops flows, you may find some value in my Vagrant/Ansible based clustering template for CoreOS/Fleet.

What can I do with it?

You can clone it and be in production in 5 minutes with your own social news site, ready for re-factoring into whatever you're looking to build; and you're ready to support significant traffic loads (anecdotally limited only by how much you want to pay for, not by the technology). You'll have a familiar workflow already set up along with templates to edit in place, and can start by focusing on your data model and core value proposition rather than worrying about boilerplate… and then when you have what you need at the core, everything else is fully open and accessible for you to create, extend, edit, and tune as needed.
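To give a sense of the flow, a minimal sketch looks something like this (assuming you have the cf CLI installed and are logged in to a Cloud Foundry target such as Pivotal WS, and that the repo ships a CF manifest - check the README; otherwise pass an app name to cf push):

$ git clone https://github.com/thomaswilley/geeknews
$ cd geeknews
$ cf push

cf push stages and deploys the app against whatever target you're logged in to, which is most of the devops flow you need for a first run.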

*Scaling is possible thru out-of-the-box compatibility with CloudFoundry and Pivotal WS in particular (disclaimer: I work for EMC, Pivotal's parent company - however all views expressed here are my own).

Turning The Page on 2014

Zooming way out, 2014 was a good year. Despite the increasingly sensational and many times troubling headlines in mainstream media, things are getting better all around us. We're living 75+ years on average now, and the world's becoming a safer place. I hope we reach a time when we can collectively graduate from needing traditional war and country borders as tools for power. Something tells me that technological progress, and the pace thereof, is going to have an important role to play in this graduation.

[Charts from Slate: number of armed conflicts, prevalence of mass killings, democracy and autocracy, victimization of children, homicide rates, violence against female partners]

And from Bill Gates' 2014 review (gatesnotes.com), there are a number of general health improvements we're seeing as well, including falling child mortality rates and curing, or at least strongly countering, terrible diseases like HIV, Ebola, and polio. These continued improvements contribute to our longer lifespans and more abundant lives.

[Charts from gatesnotes: under-5 mortality rate, tipping point for HIV, fewer pills, polio cases drop]

2014 was a banner year for scientific progress (wikipedia.org). And the lines between science & discovery and the humanitarian areas continue to blur.

We managed to land a probe (Philae) on a moving comet so we could get a better sense of what’s inside and the role comets play within our universal ecosystem.

[Images: Philae landing, 2014]

We’ve got a gen1 planetary robot (Curiosity) helping us explore Mars from afar. To boot, Curiosity’s just confirmed organic materials, interesting environment interactions, evidence of historic water, and more.

[Images from Curiosity, 2014: Martian landscape, BOM, hydro topographic, Martian hillside]

We like what we've seen enough to set a new BHAG for NASA: double down on discovery and bring humans to Mars to continue in person.

[NASA image: go to Mars]


Looking forward to 2015, here’s to making it our best yet!

Quick scripted local CoreOS clusters

CoreOS is a really interesting project. It basically takes the best ingredients from ChromeOS, like OS-level auto-updating, melts the rest down to just the essentials, and adds a few critical goodies, all in all making a clean, modern Linux distribution that should cluster extremely well. This is a quick ansible / vagrant script that spins up a local cluster to be able to play around with fleet in particular.

https://github.com/thomaswilley/coreos-cluster
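If you want to kick the tires, the flow is most likely the usual Vagrant one (assuming VirtualBox, Vagrant, and Ansible are installed locally; the repo's README is the authoritative source):

$ git clone https://github.com/thomaswilley/coreos-cluster
$ cd coreos-cluster
$ vagrant up
$ vagrant status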

Put your Microsoft Project Plan online with PMOcean.com

[Image: PMOcean landing page]

Sometimes you just need a simple way to publish your Microsoft Project plan online and invite others to collaborate on it with you. I created PMOcean.com to do just that.

There are plenty of apps to help if you’re looking to build or manage a project plan from scratch. At this point, all the best ones are online, at least in part - even Microsoft Project now comes in an online flavor thru Office 365.

That said, as a delivery lead you often still find yourself using trusty Microsoft Project locally - maybe you just need something quick and Project's on hand, or maybe you've been using it for a long time and like it better, maybe there's a missing feature in the online edition - who knows. In any case, it happens, and especially when others on your team are unfamiliar with Microsoft Project, it can be a real pain making that come together - particularly at scale. It's not uncommon at all for larger projects to have their PMO take on an administrative tax to help with transparency - they'll chop the Microsoft Project plan into Excel sheets, email them around, pester those responsible for updates, collate, etc. If you've been around large scale cross-functional/cross-organizational solution delivery in the past 10 years, this drill probably sounds familiar.

PMOcean is intended to cut this administrative tax so you can focus on more important endeavors.

Here’s how it works at a high level - and let’s suppose your delivery rhythm requires weekly snapshots for transparency and you’ve already built your integrated project plan and maintain it under change control:

  1. Upload your project plan to PMOcean.com and invite those responsible to collaborate thru PMOcean if they haven’t already been invited
  2. Remind your team to provide their edits and updates by the end of the week, say 12p Friday
  3. Check into PMOcean.com and review the changes, incorporate updates into your integrated project plan, say by 5p Friday

And repeat! It really is that simple.

Here’s a snapshot of the editor:
[Image: PMOcean project editor]

Why step 3 instead of a big "export to Microsoft Project" button? I'm a big believer in managing the integrated project plan by hand and going thru people's updates one at a time - because it gives you as a delivery lead the opportunity to assess impacts, ask the right questions, and get intimately familiar with the status of your program or project before you roll it up and out - ultimately making you more valuable to the team delivering the work. Your team will appreciate it, especially that you're doing everything you can to save them time on status.

Not solving world hunger here with PMOcean, but scratching a specific itch. I'd love to hear from you about how PMOcean fits your needs and what you'd most like to see next - please reach out anytime - hope PMOcean can help you deliver great things in 2014!

An afternoon setting up Postgres (9.2) on OS X (10.7, and perhaps higher)

Some notes I jotted down after spinning up Postgres this afternoon. Nothing fancy about the setup here; it should be just about enough to get you up and running with the basics.

Installation (ala Homebrew)
$ brew install postgresql
To do anything with it, you actually need to create a ‘database system’ with:
$ initdb /usr/local/var/postgres -E utf8
This directory is basically where the database system lives and is generally used by all the control commands (as seen below).

Enable access via /usr/local/var/postgres/pg_hba.conf and incoming TCP connections via postgresql.conf in the same directory (the database system path above). These files are heavily commented and worth browsing thru to see how. Ultimately I didn't need to edit pg_hba.conf at all, but did uncomment the listen_addresses and port settings in postgresql.conf to accept incoming TCP connections.
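For reference, once uncommented, the relevant lines in postgresql.conf end up looking something like this (these are the stock defaults; adjust to taste):

listen_addresses = 'localhost'
port = 5432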

Control & Start-up
start single threaded:
$ postgres -D /usr/local/var/postgres
pg_ctl is a handy controller for postgres
start in the background:
$ pg_ctl -D /usr/local/var/postgres -l postgres_server.log start
check the status:
$ pg_ctl -D /usr/local/var/postgres status
stop the (presumably background) job:
$ pg_ctl -D /usr/local/var/postgres stop -s -m fast
Interact and create Database(s) and User(s)
Postgres’s CLI for interaction is psql. It should already be on your path. Connect psql to the running instance (make sure the Postgres db server is running using the status command above)
$ psql -h 127.0.0.1 -p 5432
(Because in this setup we're connecting over TCP, per the config steps above, we might as well add the port parameter too, to be doubly sure this all works before wiring it into a project.) At psql's prompt…
  • \du   lists users and roles
  • \l    shows all databases
  • \?    shows more psql commands
Create a database and user, and give the user perms on the db
CREATE USER [username] WITH ENCRYPTED PASSWORD '[password]';
CREATE DATABASE [database] ENCODING 'UTF8' OWNER [username];
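OWNER already gives that user full rights on the new database; if you want an explicit grant on top of it, standard Postgres SQL does the trick:

GRANT ALL PRIVILEGES ON DATABASE [database] TO [username];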

Wire it into a project - Django today - so in settings.py, remember to set HOST to "127.0.0.1" and you might as well set PORT to "5432" for good measure (assuming that matches the port in the Postgres configs).
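For reference, a minimal sketch of that settings.py wiring might look like the following, where [database], [username], and [password] are whatever you created above and psycopg2 is installed in the project's environment:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',  # standard psycopg2 backend
        'NAME': '[database]',
        'USER': '[username]',
        'PASSWORD': '[password]',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}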

Update: great list of “PostgreSQL Basics by Example” on github from darthdeus to help from here! http://darthdeus.github.io/blog/2013/08/19/postgresql-basics-by-example

NTS: handy python one-liner to see where site-packages are

python -c "from distutils.sysconfig import get_python_lib; print get_python_lib()"
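On Python 3 (where distutils is still present), print is a function, so the equivalent one-liner would be:

python3 -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())"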

Quick & dirty deployments on EC2 with Python, Django, Pip, and Git.

As I was working on our wedding website, I quickly started realizing my fiancé and others would need to view it as it went up. And make changes… lots of them.

The idea of one-click (one-key) deploys is becoming something of a religion for me with my hobby projects, and why should our wedding site be any different? So I decided to start from complete scratch and put an Ubuntu 10.10 server on EC2 to the task of hosting the site with a one-key deploy using Git, document the steps, and share them in case they help you in your efforts. Sure, I could've used localtunnel or something like it, but where's the fun in that? I want to try this app live on the same server I plan to deploy its beta on, and learn something along the way.

Disclaimer: There are lots of great ways to automate deployments, especially with Python and OSS. Perhaps the most notable is Fabric, which is built for just this challenge. I don't know Fabric well enough to make a simple tutorial of it, but I do know Git, and I wanted something really simple.

Assumptions: Your app is ready to be deployed. It works locally on the django development server. You know the basics of POSIX, AWS & EC2, Python, Django, Git, and are running a reasonably current Mac or POSIX OS, and have done at least some of this before.

As a simplifying convention, I’ll use “LOCAL MACHINE” followed by steps to be done on your local dev machine, and “REMOTE MACHINE” followed by steps to be done on your remote EC2 instance. In case it’s not clear don’t type anything in <note:> brackets.

Let’s begin…

LOCAL MACHINE

  • Launch a new server instance, in this example: Ubuntu 10.10 via AMI: ami-ccf405a5
  • Enable SSH, HTTP, HTTPS (and anything else you want)
  • Create & save (or re-use) your key locally so we can refer to it as <aws_key>, typically at local path ~/.ssh/<aws_key>

Note: Elastic IPs are nice to use here. They just make life easier for a number of reasons, particularly if you're going to put this on a domain later; it can be handy depending on which DNS route you choose. In our case we'll use one so our local hosts file (POSIX) can point reliably to a named server and shorten shell incantations…

LOCAL MACHINE

  • Edit your hosts file (on path /etc/hosts) to append a line (hopefully replacing the <elastic-ip….> part with the actual ip):

    <elastic-ip.to.remote.server>    ec2server

  • Login to the remote machine, “ec2server”

    $ ssh -i ~/.ssh/<aws_key> ubuntu@ec2server

Now it’s time to get some of the basics out of the way and install git…

REMOTE MACHINE

$ sudo apt-get update && sudo apt-get upgrade
$ sudo dpkg-reconfigure tzdata <note: to set timezone>
$ sudo apt-get install git-core git

Now that the server has the absolute minimum for this exercise we can create our git repository and keep pushing forward. Note here that if you plan to use this for more public facing deployments it’s best to secure the machine at this stage.

$ cd ~/ && mkdir productionsite.git && cd productionsite.git
$ git init --bare

Now that the repo is initialized, we'll be able to add it as a remote on our local dev machine. The key thing to note here is that just publishing code to the remote Git repo does nothing to deploy it. For the deployment steps, we'll use a post-receive hook on the repo. In short, in case it's new information for you: Git ships with a handful of hooks, which are simply shell scripts that Git executes when certain actions occur. By creating / editing these scripts (hooks) you can do all kinds of neat things as you take action with the system. For our purposes, we want to simply check out all the code from the repo into a directory on the remote machine, which we can then run a django development server on. All of this can be automated, but for now, let's just checkout the code and we'll manually run the development server.

Continuing on, make a directory to hold the checked-out code from the repo

$ cd /home/ubuntu
$ mkdir web

Now that we have a destination for checked-out code at /home/ubuntu/web, let's make a quick post-receive hook that does the actions described above. It's 2 lines…

$ cd /home/ubuntu/productionsite.git
$ cat > hooks/post-receive
#!/bin/sh
GIT_WORK_TREE=/home/ubuntu/web git checkout -f
<note: ctrl+d finishes this command>

$ chmod +x hooks/post-receive

That is all there is to it - pretty straightforward right? Now it's time to go back into your local dev machine and connect your local repo to the new remote, and test everything out. At this point, I'm assuming you have a local project already set up somewhere, possibly already remoted to GitHub or something - at least with Git already initialized.

LOCAL MACHINE

$ cd <path-to-local-project-directory>
$ ssh-add ~/.ssh/<aws_key>
$ git remote add ec2 ssh://ubuntu@ec2server/home/ubuntu/productionsite.git

Now your remote is named “ec2” and points to the remote machine named “ec2server”. 

On your first push, you will need to set up the repo with the right head information, so the first push should be:

$ git push ec2 +master:refs/heads/master

For subsequent pushes you can simply use:

$ git push ec2

Note here that after your first push, your code is published to your remote machine and should be checked out into the /home/ubuntu/web directory that we set up for it. So let's log in and do some validation that the push and post-receive hook work as expected.

 REMOTE MACHINE

$ cd /home/ubuntu/web && ls <note: you should see your code checked out here>

If your code isn't checked out, a good first place to double check is back in the shell on your local machine. Git will print any error messages it received from your ec2 remote repo in the local shell window as part of the push output.

Now you're done with nearly the simplest possible code deployment setup. You can deploy in one command, but now you need django and all the key software components of your app installed on the server. The reason I install python dev tools at this step instead of at the beginning is to reinforce the fact that this is a loosely coupled exercise. You really can stay quick & dirty with Git remotes, and use any other tools from there on out.

Here forward, I'm assuming you used distribute and did a "$ pip freeze > requirements.txt" to capture all the packages you need. I'll also assume that requirements.txt is in your django project folder and got pushed to your remote server successfully as a result of the previous steps in this exercise. If you didn't, you'll want to run the following on your local machine in your django project directory:

$ pip freeze > requirements.txt
$ git add requirements.txt && git commit -am 'adding requirements.txt' && git push ec2

Let’s move on and install the dependencies on our remote system.

REMOTE MACHINE

$ sudo apt-get install python-pip python-dev build-essential
$ sudo pip install --upgrade pip
$ pip freeze <note: to see what's already there and validate the toolchain's working>
$ sudo pip install -r /home/ubuntu/web/<rest-of-path-to-django-proj-dir>/requirements.txt

That's it, you should have your app's environment installed. We didn't use a virtualenv on the server because we're lazy in this tutorial, and it could reasonably be argued that there's no need when you have a dedicated server for your app. Either way, now it's time to make sure your app works.
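If you'd rather use a virtualenv anyway, a minimal sketch looks something like this (the paths are just examples):

$ sudo pip install virtualenv
$ virtualenv ~/venv
$ source ~/venv/bin/activate
$ pip install -r /home/ubuntu/web/<rest-of-path-to-django-proj-dir>/requirements.txt

Just remember that the sudo runserver command later in this post won't see the virtualenv unless you point it at the virtualenv's python explicitly.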

At this point you might have some settings in your settings.py file that are wired to directories on your development machine, or other nuances. It’s a good time to open your settings.py file and make sure you’re ready for the remote machine. Sometimes the easiest way to do this is start the development webserver and take a look at your app - the same command you can use when you’re all done and ready to show off your work as it’s being built!

$ cd /home/ubuntu/web <note: assuming this is where your manage.py is>
$ sudo python manage.py runserver 0.0.0.0:80 <note: to run the django development server on port 80>
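As for settings that are wired to directories on your dev machine, one common approach (just a sketch; the names are illustrative) is to build paths relative to settings.py rather than hard-coding them:

import os

# directory containing settings.py; resolves correctly on both machines
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))

MEDIA_ROOT = os.path.join(PROJECT_ROOT, 'media')
TEMPLATE_DIRS = (os.path.join(PROJECT_ROOT, 'templates'),)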

All done! Hope that’s helpful, feedback welcome.

git immersion (http://gitimmersion.com)

Great site for learning the venerable Git for distributed version control and better understanding how to interact with github.com, still the best place to collaborate on building great software.

NTS: how to put pyobjc on OSX 10.6+

Firstly I had to sudo easy_install pyobjc

$ sudo easy_install pyobjc==2.2

Then for some reason I realized that the thing only works with python26-apple, so

$ sudo python_select python26-apple

Then you can run the instructions here and not run into most of the errors in the comments.

> http://ioanna.me/2009/09/installing-pyobjc-xcode-templates-in-snow-leopard/

NTS: interesting (really simple) jQuery & Django file upload

http://wiki.phonegap.com/w/page/18270855/Image-Upload-using-JQuery-and-Python