When inserting a large amount of data into the DB in Rails, issuing a large number of INSERT statements takes a lot of time. We can reduce this time dramatically using mysqlimport, a MySQL tool for loading bulk data into the DB quickly; quoting the MySQL docs, mysqlimport "reads rows from a text file into a table at a very high speed". To use mysqlimport for updating records as well, not only for inserting new ones, we used a common trick for this problem: create a temporary table, insert the updated records (with ids) into it, join the temporary table with the target table on the id column, update the target table's columns accordingly, then drop the temporary table. You can find this idea illustrated for example here:
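The trick boils down to three SQL statements; here is a minimal sketch with hypothetical table and column names (users, name, email):

```sql
-- Stage the updated rows in a table with the same schema as the target
CREATE TABLE temp_users LIKE users;

-- (bulk-load the updated rows, including ids, into temp_users here,
--  e.g. with mysqlimport)

-- Copy the staged values onto the matching target rows
UPDATE users
INNER JOIN temp_users ON users.id = temp_users.id
SET users.name = temp_users.name,
    users.email = temp_users.email;

DROP TABLE temp_users;
```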

So here is a helper class to use this technique within a Rails application.
class SqlWriter

  ID_STR = 'id'
  CREATED_AT_STR = 'created_at'
  UPDATED_AT_STR = 'updated_at'
  NULL_STR = '\N'
  COMMA_STR = ','

  attr_accessor :insert_sql_file, :update_sql_file

  # klass is the class of the records we will deal with.
  # sql_dir_path is the directory which will contain the sql data files (text files).
  # Note: mysqlimport derives the target table name from the data file's basename,
  # which is why each file is named after the table it loads into.
  def initialize(klass, sql_dir_path)
    @klass = klass
    @temp_table_name = "temp_#{klass.table_name}_#{Time.now.to_s(:db).gsub(/-| |:/, '_')}_#{SecureRandom.hex[0..10]}"
    @insert_sql_file = File.new("#{sql_dir_path}/#{klass.table_name}.txt", 'w')
    @update_sql_file = File.new("#{sql_dir_path}/#{@temp_table_name}.txt", 'w')
    @current_time_in_db_format = Time.now.to_s(:db)
    @insert_fields = klass.new.attributes.except(ID_STR).keys
    @update_fields = klass.new.attributes.keys
    @records_need_update = false
  end

  # Appends the record as one comma-separated line to the insert file
  # (new records) or to the update file (existing records).
  def write_record_to_sql_file(record)
    row_data = get_sql_row(record)
    if record.new_record?
      @insert_sql_file.write(row_data)
    else
      @update_sql_file.write(row_data)
    end
  end

  def insert_records_to_database
    # Close the files so all buffered rows are flushed before mysqlimport reads them
    @insert_sql_file.close
    @update_sql_file.close
    config   = Rails.configuration.database_configuration
    database = config[Rails.env]["database"]
    username = config[Rails.env]["username"]
    password = config[Rails.env]["password"]
    host     = config[Rails.env]["host"]
    insert_columns_order = @insert_fields.join(',')
    `mysqlimport -u #{username} -p#{password} -h #{host} --columns='#{insert_columns_order}' --local --fields-terminated-by=',' #{database} #{Shellwords.escape(@insert_sql_file.path)}`
    if @records_need_update
      ActiveRecord::Base.connection.execute("CREATE TABLE #{@temp_table_name} LIKE #{@klass.table_name};")
      update_columns_order = @update_fields.join(',')
      `mysqlimport -u #{username} -p#{password} -h #{host} --columns='#{update_columns_order}' --local --fields-terminated-by=',' #{database} #{Shellwords.escape(@update_sql_file.path)}`
      set_fields = @insert_fields.map { |field| "#{@klass.table_name}.#{field}=#{@temp_table_name}.#{field}" }.join(',')
      ActiveRecord::Base.connection.execute("UPDATE #{@klass.table_name} INNER JOIN #{@temp_table_name} ON #{@klass.table_name}.id = #{@temp_table_name}.id SET #{set_fields}")
      ActiveRecord::Base.connection.execute("DROP TABLE #{@temp_table_name}")
    end
  end

  private

  # Converts a record's attributes into one comma-separated row, translating
  # dates/times to DB format, booleans to 1/0, and nil to \N (or to the
  # current time for the timestamp columns).
  def get_sql_row(record)
    if record.new_record?
      result = record.attributes.except(ID_STR).values
      fields = @insert_fields
    else
      result = record.attributes.values
      fields = @update_fields
      @records_need_update = true
    end
    result.each_with_index do |item, index|
      if item.class == Date || item.class == Time
        result[index] = item.to_s(:db)
      elsif item == true || item == false
        result[index] = item ? 1 : 0
      elsif item.nil?
        if fields[index] == CREATED_AT_STR || fields[index] == UPDATED_AT_STR
          result[index] = @current_time_in_db_format
        else
          result[index] = NULL_STR
        end
      end
    end
    result.join(COMMA_STR) + "\n"
  end
end
For example, assume that we are inserting a large number of records for the User model:

sql_file_dir = "path/to/some/dir"
sql_writer = SqlWriter.new(User, sql_file_dir)
alot_of_data.each do |data|
  user = User.new(data)
  sql_writer.write_record_to_sql_file(user)
end
sql_writer.insert_records_to_database

And you will have your data inserted into the DB!
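For reference, here is what the per-value conversion in get_sql_row produces, sketched in plain Ruby without Rails (the column values below are hypothetical):

```ruby
require 'date'

NULL_STR = '\N'

# Mirrors the conversions get_sql_row applies to each attribute value:
# dates/times -> "YYYY-MM-DD HH:MM:SS", booleans -> 1/0, nil -> \N.
def to_sql_value(item)
  case item
  when Time, Date then item.strftime('%Y-%m-%d %H:%M:%S')
  when true, false then item ? 1 : 0
  when nil then NULL_STR
  else item
  end
end

values = ['Alice', true, nil, Time.utc(2013, 6, 16, 2, 0, 0)]
row = values.map { |v| to_sql_value(v) }.join(',') + "\n"
# row is one line of the mysqlimport data file:
# "Alice,1,\\N,2013-06-16 02:00:00\n"
```

Each line of the data file is just such a comma-separated row in column order, matching the --columns list passed to mysqlimport.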

When choosing a DBMS for your project, you will find many DBMSs with different performance tradeoffs. A wise decision should be based on the use case you are interested in: some projects need to scale up to millions of users and care a great deal about the response time for serving each of those users, while other projects may be limited to a small number of users who perform heavy queries to generate detailed reports.

In this post we will discuss some use cases that seem to be common across many different systems, and for each use case we will conclude why (or why not) to use a specific DBMS:


Rails Translation Center Gem

June 16th 2013, 2:00 am | Category: Rails

A common task in most Rails applications is internationalization. You have to work with YAML and text files to manage your web application's translations. Sometimes you need your system's users to help you translate it into several languages, so you need a user-friendly, easy tool for translating your web application.

If you are already using Heroku on your Mac or Linux machine with an email like me@x.com and you want to change the currently active username to another one like me@y.com, you should do the following:

Installing RedCloth gem on Windows 7 64-bit

January 20th 2011, 2:51 am | Category: Rails

While trying to install the RedCloth gem on my Windows machine, I got the following error message:


"Installing RedCloth (4.2.2) D:/Ruby187/lib/ruby/site_ruby/1.8/rubygems/defaults/operating_system.rb:9: The 'RedCloth' native gem requires installed build tools."


After some trials and Googling, I found the following solution, and it worked:

gem install RedCloth --platform=mswin32

UPDATE: 7th Feb, 2012


To use Bundler, write the following in the Gemfile:

gem 'RedCloth', :platforms => :mswin