How can I insert large files in MySQL db using PHP?

Accepted answer
Score: 15

You will want to check the MySQL configuration value "max_allowed_packet", which might be set too small, preventing the INSERT (which is large itself) from happening.

Run the following from a mysql command prompt:

mysql> show variables like 'max_allowed_packet';

Make sure it's large enough. For more information on this config option, see

MySQL max_allowed_packet

This limit also affects mysql_escape_string() and mysql_real_escape_string() in PHP, restricting how large a query string you can build.
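If it helps, here is a minimal PHP sketch of that check, assuming mysqli, a file at /tmp/bigfile.bin and a files (name, data LONGBLOB) table, none of which come from the question:

<?php
// Sketch only: the connection details, file path and table are assumptions.
$db = new mysqli('localhost', 'user', 'pass', 'test');

// Read the server's max_allowed_packet so we can warn before the INSERT fails.
$row = $db->query("SHOW VARIABLES LIKE 'max_allowed_packet'")->fetch_assoc();
$maxPacket = (int) $row['Value'];

$path = '/tmp/bigfile.bin';   // hypothetical file
if (filesize($path) >= $maxPacket) {
    // Rough check: the whole INSERT packet has to stay under max_allowed_packet.
    die("File is larger than max_allowed_packet ($maxPacket bytes); raise it in my.cnf.");
}

// A prepared statement sidesteps mysql_real_escape_string() and its size limits.
$stmt = $db->prepare('INSERT INTO files (name, data) VALUES (?, ?)');
$name = basename($path);
$data = file_get_contents($path);
$stmt->bind_param('ss', $name, $data);
$stmt->execute();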

Score: 8

As far as I know, it's generally quicker and better practice not to store the file in the db, as the database will get massive very quickly and slow down. It's better to store the file in a directory and just store the location of the file in the db.

We do this for images/PDFs/MPEGs etc. in the CMS we have at work by creating a folder for the file, named from the URL-safe filename, and storing the folder name in the db. It's then easy to write out the file's URL in the presentation layer.
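A rough sketch of that approach, assuming an upload field called "upload", an uploads/ directory next to the script, and an attachments (filename, path) table, all invented for the example:

<?php
// Sketch of "store the file on disk, store its location in the DB".
$uploadDir = __DIR__ . '/uploads/';

// Build a URL-safe name and give the file its own folder, as described above.
$safeName = preg_replace('/[^a-zA-Z0-9._-]/', '-', basename($_FILES['upload']['name']));
$folder   = $uploadDir . pathinfo($safeName, PATHINFO_FILENAME) . '/';
if (!is_dir($folder)) {
    mkdir($folder, 0755, true);
}
move_uploaded_file($_FILES['upload']['tmp_name'], $folder . $safeName);

// Only the relative path goes into the database.
$db   = new mysqli('localhost', 'user', 'pass', 'cms');
$stmt = $db->prepare('INSERT INTO attachments (filename, path) VALUES (?, ?)');
$rel  = 'uploads/' . pathinfo($safeName, PATHINFO_FILENAME) . '/' . $safeName;
$stmt->bind_param('ss', $safeName, $rel);
$stmt->execute();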

Score: 4

Some PHP extensions for MySQL have issues with LONGBLOB and LONGTEXT data types. The extensions may not support blob streaming (posting the blob one segment at a time), so they have to post the entire object in one go.

So if PHP's memory limit or MySQL's packet size limit restricts the size of an object you can post to the database, you may need to change some configuration on either PHP or MySQL to allow this.

You didn't say which PHP extension you're using (there are at least three for MySQL), and you didn't show any of the code you're using to post the blob to the database.
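For what it's worth, mysqli can post a blob parameter in segments via mysqli_stmt::send_long_data(); each segment still has to fit within max_allowed_packet. A minimal sketch, with the table and file path invented for the example:

<?php
// Sketch only: assumes a files (name, data LONGBLOB) table.
$db   = new mysqli('localhost', 'user', 'pass', 'test');
$stmt = $db->prepare('INSERT INTO files (name, data) VALUES (?, ?)');

$null = null;                      // placeholder; the blob is streamed separately
$name = 'bigfile.bin';
$stmt->bind_param('sb', $name, $null);

$fp = fopen('/tmp/bigfile.bin', 'rb');
while (!feof($fp)) {
    // Send the blob to the server one 512 KB segment at a time.
    $stmt->send_long_data(1, fread($fp, 512 * 1024));
}
fclose($fp);
$stmt->execute();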

Score: 4

The best answer is to use an implementation that works around the issue. You can read an article here. Store 10 MB or 1000 MB, it doesn't matter: the implementation chunks the file into many smaller pieces and stores them in multiple rows. This helps with loading and fetching, so memory doesn't become an issue either.
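A rough sketch of that chunking idea (the table layout, 1 MB piece size and file id are all assumptions here, not taken from the linked article):

<?php
// Assumed table: file_chunks (file_id INT, seq INT, data MEDIUMBLOB,
//                             PRIMARY KEY (file_id, seq))
$db   = new mysqli('localhost', 'user', 'pass', 'test');
$stmt = $db->prepare('INSERT INTO file_chunks (file_id, seq, data) VALUES (?, ?, ?)');

$fileId    = 1;
$seq       = 0;
$piece     = '';
$chunkSize = 1024 * 1024;          // 1 MB pieces, safely below most packet limits
$stmt->bind_param('iis', $fileId, $seq, $piece);   // bound by reference

$fp = fopen('/tmp/bigfile.bin', 'rb');
while (!feof($fp)) {
    $piece = fread($fp, $chunkSize);
    if ($piece === '' || $piece === false) {
        break;
    }
    $stmt->execute();              // one row per piece
    $seq++;
}
fclose($fp);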

Score: 3

You could use MySQL's LOAD_FILE function to store the file, but you still have to obey the max_allowed_packet value and the fact that the file must be on the same server as the MySQL instance.
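For example (the files table is an assumption; the path must be readable by the MySQL server, the connecting user needs the FILE privilege, and LOAD_FILE() quietly returns NULL if secure_file_priv disallows the location):

mysql> INSERT INTO files (name, data)
    -> VALUES ('report.pdf', LOAD_FILE('/var/lib/mysql-files/report.pdf'));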

Score: 0

You don't say what error you're getting (use mysql_error() to find out), but I suspect you may be hitting the maximum packet size.

If this is the case, you'd need to change the max_allowed_packet value in your MySQL configuration.
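For example, in my.cnf/my.ini (64M is just a placeholder; pick a value larger than your biggest file plus query overhead):

[mysqld]
max_allowed_packet = 64M

Or at runtime, for new connections only and with the SUPER privilege:

mysql> SET GLOBAL max_allowed_packet = 67108864;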

Score: 0

Well, I have the same problem. Data cannot be written to the MySQL database chunk by chunk in an "I/O mode" like this:

loop:
   read $data from file
   write $data to blob
end loop
close file
close blob

A solution seems to be to create a table with multi-part blobs, something like: create table data_details (id int auto_increment primary key, chunk_number int not null, dataPart blob);
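A sketch of reading such a file back out of that kind of table, assuming a file_id column is added so rows from different files can be told apart (mysqli with mysqlnd for get_result()):

<?php
// Reassemble the file from its chunks, in chunk_number order.
$db = new mysqli('localhost', 'user', 'pass', 'test');

$stmt = $db->prepare(
    'SELECT dataPart FROM data_details WHERE file_id = ? ORDER BY chunk_number'
);
$fileId = 1;
$stmt->bind_param('i', $fileId);
$stmt->execute();
$result = $stmt->get_result();

$out = fopen('/tmp/restored.bin', 'wb');
while ($row = $result->fetch_assoc()) {
    fwrite($out, $row['dataPart']);   // append each chunk in order
}
fclose($out);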
