MySQL bulk insert

Accepted answer
Score: 16

You could write a single insert query that would do several inserts in a single call to the database:

insert into your_table (field1, field2, field3)
values 
  (value1_1, value1_2, value1_3), 
  (value2_1, value2_2, value2_3), 
  (value3_1, value3_2, value3_3)


Here, with the example I've given, this single query would insert three rows in the table.

Score: 8

MySQL's LOAD DATA command might be useful to you: http://dev.mysql.com/doc/refman/5.5/en/load-data.html

With reference to Pascal's suggestion: unless your command exceeds max_allowed_packet, you should be able to execute this query. In many cases it works best to create a few smaller inserts with, say, 1000 rows in each.
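A sketch of that chunking idea (the class and method names here are hypothetical): split the rows into chunks of 1000 and build one multi-row INSERT per chunk, so no single statement risks exceeding max_allowed_packet. Values are assumed to be pre-escaped literals for brevity; in real code you would use PreparedStatement placeholders instead.

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedInsertBuilder {

    // Build one multi-row INSERT statement per chunk of rows.
    // Each row is rendered as a (v1, v2, ...) tuple.
    static List<String> buildInserts(String table, String columns,
                                     List<String[]> rows, int chunkSize) {
        List<String> statements = new ArrayList<>();
        for (int start = 0; start < rows.size(); start += chunkSize) {
            StringBuilder sb = new StringBuilder();
            sb.append("insert into ").append(table)
              .append(" (").append(columns).append(") values ");
            int end = Math.min(start + chunkSize, rows.size());
            for (int i = start; i < end; i++) {
                if (i > start) sb.append(", ");
                sb.append("(").append(String.join(", ", rows.get(i))).append(")");
            }
            statements.add(sb.toString());
        }
        return statements;
    }

    public static void main(String[] args) {
        // 2500 sample rows -> 3 statements of at most 1000 rows each
        List<String[]> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) {
            rows.add(new String[]{"'a" + i + "'", "'b" + i + "'", "'c" + i + "'"});
        }
        List<String> stmts =
            buildInserts("your_table", "field1, field2, field3", rows, 1000);
        System.out.println(stmts.size());
        System.out.println(stmts.get(2).endsWith("('a2499', 'b2499', 'c2499')"));
    }
}
```

Each generated statement is then executed in its own call, keeping every packet comfortably small.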

Score: 6

You can execute your statements in batch; some example code can be found here.

Also, call setAutoCommit(false) before, and conn.commit() after, executeBatch() to minimise the number of commits.
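To show the call pattern without a live database, here is a sketch using a stub in place of a real JDBC PreparedStatement/Connection (the FakeBatch class is purely illustrative): rows are added with addBatch, flushed with executeBatch every N rows, and committed once at the end.

```java
public class BatchPattern {

    // Stub standing in for a JDBC PreparedStatement/Connection pair,
    // counting calls so the pattern can be verified without a database.
    static class FakeBatch {
        int pending = 0, executeBatchCalls = 0, commitCalls = 0, rowsWritten = 0;
        void addBatch()     { pending++; }
        void executeBatch() { rowsWritten += pending; pending = 0; executeBatchCalls++; }
        void commit()       { commitCalls++; }
    }

    public static void main(String[] args) {
        // With a real JDBC connection the equivalent calls would be:
        //   conn.setAutoCommit(false);
        //   PreparedStatement ps = conn.prepareStatement(
        //       "insert into your_table (field1) values (?)");
        //   ps.setString(1, value); ps.addBatch(); ... ps.executeBatch(); conn.commit();
        FakeBatch stmt = new FakeBatch();
        int totalRows = 2500, batchSize = 1000;
        for (int i = 1; i <= totalRows; i++) {
            stmt.addBatch();
            if (i % batchSize == 0) stmt.executeBatch(); // flush periodically
        }
        if (stmt.pending > 0) stmt.executeBatch();       // flush the remainder
        stmt.commit();                                   // single commit at the end
        System.out.println(stmt.rowsWritten + " " + stmt.executeBatchCalls
                           + " " + stmt.commitCalls);
    }
}
```

For 2500 rows with a batch size of 1000 this performs three executeBatch calls but only one commit, which is where the speed-up comes from.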

Score: 2

Bulk insert of more than 7,000,000 records in one minute (a very fast query that also performs calculations during the load):

    mysqli_query($cons, '
    LOAD DATA LOCAL INFILE "'.$file.'"
    INTO TABLE tablename
    FIELDS TERMINATED by \',\'
    LINES TERMINATED BY \'\n\'
    IGNORE 1 LINES
    (isbn10,isbn13,price,discount,free_stock,report,report_date)
     SET RRP = IF(discount = 0.00,price-price * 45/100,IF(discount = 0.01,price,IF(discount != 0.00,price-price * discount/100,@RRP))),
         RRP_nl = RRP * 1.44 + 8,
         RRP_bl = RRP * 1.44 + 8,
         ID = NULL
    ') or die(mysqli_error($cons));
    $affected = (int) (mysqli_affected_rows($cons))-1; 
    $log->lwrite('Inventory.CSV to database:'. $affected.' record inserted successfully.');

RRP, RRP_nl, and RRP_bl are not in the CSV; they are calculated during the load and then inserted.

Score: 0

In MySQL you can use LOAD DATA INFILE:

LOAD DATA INFILE 'C:\MyTextFile'
INTO TABLE myDatabase.MyTable
FIELDS TERMINATED BY ','
