#Catalyst on @dot_cloud: Adding a #PostgreSQL data service (#Perl in the cloud, Part IV)
Cross-posted from blogs.perl.org
Following up on my previous post that demonstrated how to get a basic Catalyst application up-and-running on dotCloud in under ten minutes, let’s explore how to take things a step further by adding a database service.
For convenience's sake, I'm just going to walk you through the Catalyst::Manual::Tutorial.
However, unlike the tutorial (or most Catalyst tutorials for that matter), we’re going to use PostgreSQL instead of SQLite – and we’re going to deploy the app into the cloud vs. just developing locally (thanks to the magic of dotCloud, which makes it so easy).
Luckily, it looks like the Catalyst::Manual::Tutorial Chapter 3 has been updated with a PostgreSQL-specific appendix, which makes things a lot easier (and means that I can spare you from my terrible SQLite-to-PostgreSQL conversion skills).
Here we go:
- Following along with the tutorial, we go ahead and add Catalyst::Plugin::StackTrace to the base application module and the Makefile.PL, which ensures it will get auto-installed and built by dotCloud when we push our app. Here's the commit on GitHub. (A rough sketch of this and the view change below follows the list.)
- Next, we use the Catalyst::Helper script to create a controller for 'books' (and a simple test), and update the controller per the tutorial. Commit
- Then, using the Catalyst::Helper script again, we create a simple view called HTML that will use Template Toolkit as its rendering engine. Finally, we set the component path to let the application know where to find the templates. Commit
- Last but not least, we create the TT2 template to accompany the /books/list action. Commit
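To make those steps a bit more concrete, here's a rough sketch of the kind of code the plugin, Makefile.PL, and view changes involve. It assumes the tutorial's generic MyApp naming and the stock helper output; the authoritative versions are in the linked commits.

```perl
# lib/MyApp.pm -- add StackTrace to the plugin list
use Catalyst qw/
    -Debug
    ConfigLoader
    Static::Simple
    StackTrace
/;

# Makefile.PL -- declare the new dependency so dotCloud installs it on push
requires 'Catalyst::Plugin::StackTrace';

# lib/MyApp/View/HTML.pm -- TT view generated by the helper, pointed at root/src
__PACKAGE__->config(
    TEMPLATE_EXTENSION => '.tt2',
    INCLUDE_PATH       => [ MyApp->path_to( 'root', 'src' ) ],
);
```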
Now we diverge a little bit and head over to the [PostgreSQL appendix](http://search.cpan.org/~hkclark/Catalyst-Manual-5.8004/lib/Catalyst/Manual/Tutorial/10_Appendices.pod#PostgreSQL) and create our application's database for managing books. This assumes that you're familiar with PostgreSQL, have the PostgreSQL server and client installed, and have the Perl DBD::Pg module available.
- So, working locally for now, let's create a user for this application and then a database per the instructions.
- The data file provided by the appendix had a couple of typos, so I fixed those up here. Use that data file to load up your PostgreSQL database, and check that everything loaded properly.
- Next, create some DBIC schema models with the assistance of Catalyst::Helper::Model::DBIC::Schema. This creates the application's database model files automatically from the database tables and relationships; see this commit.
- Now, with the models auto-generated and some data in the database, we need to enable our model in our ‘books’ controller. Commit
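For reference, here's roughly what the generated model and the model-aware controller action end up looking like. The MyApp, MyApp::Schema, and DB::Book names follow the tutorial's conventions, and the local connection details (database name, user, password) are hypothetical stand-ins for whatever you created above.

```perl
# lib/MyApp/Model/DB.pm -- generated by Catalyst::Helper::Model::DBIC::Schema
package MyApp::Model::DB;
use strict;
use base 'Catalyst::Model::DBIC::Schema';

__PACKAGE__->config(
    schema_class => 'MyApp::Schema',
    connect_info => {
        dsn      => 'dbi:Pg:dbname=default-catalyst',  # local database created above
        user     => 'catalyst',                        # hypothetical local role
        password => 'mysecretpass',                    # hypothetical local password
    },
);

1;

# lib/MyApp/Controller/Books.pm -- pull the rows through the new model
sub list :Local {
    my ( $self, $c ) = @_;

    # Fetch every book via the DBIC result class and hand it to the template.
    $c->stash( books    => [ $c->model('DB::Book')->all ] );
    $c->stash( template => 'books/list.tt2' );
}
```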
At this point, you can check out your application locally to ensure that everything is running. In fact, this is a good point to mention a Catalyst development trick: if you run the development server with the -r option – script/appname_server.pl -r – the server reloads whenever you update an application file, so if there's an error you can see it right away. I usually leave the window with the server output visible next to my editing window. Good for catching typos right away.
- Okay, so finishing up, we configure the HTML view to use a 'wrapper' (think header, footer, etc.) for our action-specific views, and we add a CSS file, etc. Commit (Sketched after this list.)
- Even though we're not going to use them yet, to stay consistent with the tutorial, we update the generated DBIx::Class result class files for many-to-many relationships. Commit
- Update the books/list view template to include authors. Commit
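Here's a quick sketch of what those code changes amount to: the wrapper option added to the existing view config, and the many-to-many bridges in the result classes. The relationship names follow the tutorial's schema; yours will match whatever the schema loader generated.

```perl
# lib/MyApp/View/HTML.pm -- wrap action-specific templates in a common layout
__PACKAGE__->config(
    TEMPLATE_EXTENSION => '.tt2',
    INCLUDE_PATH       => [ MyApp->path_to( 'root', 'src' ) ],
    WRAPPER            => 'wrapper.tt2',   # root/src/wrapper.tt2: header, footer, CSS link
);

# lib/MyApp/Schema/Result/Book.pm -- many-to-many bridge from books to authors
__PACKAGE__->many_to_many( authors => 'book_authors', 'author' );

# lib/MyApp/Schema/Result/Author.pm -- and the reverse direction
__PACKAGE__->many_to_many( books => 'book_authors', 'book' );
```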
Great. That was all pretty straightforward, so let’s deploy this on dotCloud:
- Add the additional requirement DBIx::Class to the Makefile. (In fact, I forgot a few requirements along the way – typical! – so let's also add: Catalyst::Model::DBIC::Schema, DBD::Pg, Catalyst::View::TT, and MooseX::NonMoose. Curiously, I thought that MooseX::NonMoose would have been built as a dependency of Catalyst::Model::DBIC::Schema, but it wasn't … so I had to add it manually to the Makefile.)
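In Makefile.PL, that boils down to a handful of extra requires lines (module names exactly as above):

```perl
# Makefile.PL -- everything dotCloud needs to install when we push
requires 'DBIx::Class';
requires 'Catalyst::Model::DBIC::Schema';
requires 'DBD::Pg';
requires 'Catalyst::View::TT';
requires 'MooseX::NonMoose';   # not pulled in automatically, so list it explicitly
```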
Okay, now for the fun part: let's add a PostgreSQL data service to our dotCloud instance by adding a couple of lines to the dotcloud.yml file (Commit), as described in their documentation on PostgreSQL. Pretty simple, eh?
Now, let’s deploy these new files to dotCloud (note that our Catalyst application and the new data service are not connected yet) with dotcloud push catalyst .
and watch dotCloud do it’s incredible magic of installing all of the CPAN modules that your Catalyst app needs. It really is magic.
If all goes well, you should see:
Deployment finished. Your application is available at the following URLs
www: http://9f385357.dotcloud.com/
Run dotcloud info catalyst.data and you should see the connection details for your new data service.
Now, you just need to connect up your new data service with your app (well, almost – we'll still need to create the remote database and load it with data). To do that, you can either put the database connection info directly into your lib/MyApp/Name/Model/DB.pm file, or read it from the dotCloud environment.json file.
However, at this point, if you put your dotCloud database connection info into your app, your local development version is going to complain loudly and will stop being useful as a way to see what you're doing before you push the app to the cloud. So this becomes a good opportunity to get our local environment set up to be as similar as possible to our cloud environment.
On dotCloud, the database connection information is automatically put into a handy environment.json file at the root of our dotCloud environment (/home/dotcloud/). So, to make things easy, let's also create an environment.json file at the root of our application directory, alongside the usual Catalyst files.
And I set my local version of environment.json to use the same variable names that dotCloud provides, but with my local connection information.
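To make that concrete, here's a minimal sketch of the kind of Model::DB the next couple of steps produce: it reads /home/dotcloud/environment.json when running on dotCloud, and falls back to the copy in the application root otherwise. The JSON key names below are placeholders – use whatever names dotcloud info and your own environment.json actually report – and the database name assumes the default-catalyst database used throughout this post.

```perl
# lib/MyApp/Model/DB.pm -- connection info from environment.json (sketch)
package MyApp::Model::DB;
use strict;
use warnings;
use base 'Catalyst::Model::DBIC::Schema';

use JSON;
use IO::All;

# On dotCloud the file lives in /home/dotcloud; locally we fall back to the
# copy in the application root (where the dev server is started from).
my $env_file = -e '/home/dotcloud/environment.json'
    ? '/home/dotcloud/environment.json'
    : 'environment.json';

my $env = decode_json( io($env_file)->all );

# NOTE: placeholder key names -- substitute the variable names that your
# environment.json actually contains.
my $dsn = sprintf 'dbi:Pg:dbname=default-catalyst;host=%s;port=%s',
    $env->{DOTCLOUD_DATA_POSTGRESQL_HOST},
    $env->{DOTCLOUD_DATA_POSTGRESQL_PORT};

__PACKAGE__->config(
    schema_class => 'MyApp::Schema',
    connect_info => {
        dsn      => $dsn,
        user     => $env->{DOTCLOUD_DATA_POSTGRESQL_LOGIN},
        password => $env->{DOTCLOUD_DATA_POSTGRESQL_PASSWORD},
    },
);

1;
```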
Okay, we’re in the home stretch now! So, to finish things off:
- To read these environment.json files, we can just add the handy JSON and IO::All modules to the Makefile. Commit
- Now we can update our Model::DB file to read the environment.json on dotCloud if it exists, or to fall back to our local version if not (along the lines of the sketch above).
- We're all set now to actually create the database (earlier, we simply created the data service). We'll do that by running dotcloud run catalyst.data -- createdb default-catalyst. Note that this is using version 0.4 of the dotCloud command-line client – future versions might change this format. The .data suffix targets the command at the data service that we set up (vs. the www service running the app). If that all worked, you should see: # createdb default-catalyst
- Last but not least, we load the data from our local development environment into the cloud database. There are probably other (possibly better!) ways to do this, but I found this approach straightforward: su - postgres, and then ./bin/pg_dump default-catalyst | ./bin/psql -h XXXXXX.dotcloud.com -p XXXXX -U root default-catalyst. Obviously, replace the Xs with the sub-domain and port of your data service.
With all of that done – phew! – we can run one last dotcloud push catalyst . to push the latest changes into the cloud, install any remaining dependencies, and restart nginx. If all went well, you should see:
- The Catalyst welcome screen here: http://9f385357.dotcloud.com
- The database-connected book list here: http://9f385357.dotcloud.com/books/list
Hopefully your PostgreSQL-backed app is now running in the cloud. Hurray!
If you find an error in this post, or have improvement suggestions, please let me know in the comments.
P.S. If your app is not running, the one thing to check that tripped me up is how dotCloud integrates with git. The key take-away is: be sure to commit your changes to git, or dotCloud won't pick them up! Personally, I found this a bit confusing, and – in the future – I'll probably use the dotcloud ssh catalyst.www command to do my dotCloud-specific debugging directly on dotCloud, then manually bring those changes back into the local version and commit them. Without doing that, I ended up with a lot of unnecessary commits in the repository as I futzed about with a connection issue.
Cross-posted from blogs.perl.org. Feel free to comment here, or on the original.