Thursday, May 19, 2011

Large Hadron Migrator: Update huge SQL tables without going offline


With all the NoSQL hotness out there, believe it or not, some people are still using relational databases. (I know, right?).

When it comes to schema changes, Active Record migrations in Rails make them so easy that developers often take them for granted. For extremely large tables, however, running an ALTER TABLE can mean taking your database offline for hours. After considering the existing options, Rany Keddo and the smart folks at SoundCloud developed their own solution.
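
To make the problem concrete, the statement at issue is a single blocking ALTER TABLE like the one below (the table and index names are borrowed from the example migration further down, purely for illustration); on a very large MySQL table it can run for hours while writes pile up behind it:

-- Plain MySQL DDL: adds the index in one shot and locks out writes while
-- the table is rebuilt (names taken from the LHM example below).
ALTER TABLE emails ADD INDEX index_emails_on_hashed_address (hashed_address);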

Large Hadron Migrator, named for CERN’s high-energy particle accelerator, uses a combination of a copy table, triggers, and a journal table to move data bit by bit into a new table, while capturing everything still being written to the source table by the live application.
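
Roughly speaking, and in deliberately simplified MySQL, the dance looks like the sketch below. These statements and names are illustrative only, not the SQL that LHM actually generates:

-- 1. Create an empty copy of the live table and apply the schema change to it.
CREATE TABLE emails_new LIKE emails;
ALTER TABLE emails_new ADD INDEX index_emails_on_hashed_address (hashed_address);

-- 2. Add a journal table and triggers so writes hitting the live table are recorded
--    (similar triggers would be needed for UPDATE and DELETE).
CREATE TABLE emails_changes (id INT PRIMARY KEY);
CREATE TRIGGER emails_after_insert AFTER INSERT ON emails
  FOR EACH ROW REPLACE INTO emails_changes (id) VALUES (NEW.id);

-- 3. Copy the existing rows across in small chunks, pausing between chunks.
INSERT INTO emails_new SELECT * FROM emails WHERE id BETWEEN 1 AND 10000;
-- ...repeat for each id range...

-- 4. Replay the journaled changes into the new table, then swap the tables.
REPLACE INTO emails_new SELECT emails.* FROM emails JOIN emails_changes USING (id);
RENAME TABLE emails TO emails_old, emails_new TO emails;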

To install, add the gem to your Gemfile:

gem 'large-hadron-migrator'

… and run bundle install.

Next, write your migration as you normally would, using the LargeHadronMigration class instead:

class AddIndexToEmails < LargeHadronMigration
  def self.up
    large_hadron_migrate :emails, :wait => 0.2 do |table_name|
      execute %Q{
        alter table %s
          add index index_emails_on_hashed_address (hashed_address)
      } % table_name
    end
  end
end
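
Since LargeHadronMigration builds on the regular Active Record migration machinery, the migration should run the same way as any other, for example via rake db:migrate, with LHM handling the chunked copy behind the scenes.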

Be sure to check out the project repo or blog post for advanced usage and caveats.

[Source on GitHub]
