[wp-trac] [WordPress Trac] #34848: Add support for updating post meta in bulk

WordPress Trac noreply at wordpress.org
Sat May 14 04:26:21 UTC 2016


#34848: Add support for updating post meta in bulk
------------------------------------------+-----------------------------
 Reporter:  patrickgarman                 |       Owner:  chriscct7
     Type:  enhancement                   |      Status:  assigned
 Priority:  normal                        |   Milestone:  Future Release
Component:  Options, Meta APIs            |     Version:
 Severity:  normal                        |  Resolution:
 Keywords:  needs-patch needs-unit-tests  |     Focuses:  performance
------------------------------------------+-----------------------------

Comment (by boonebgorges):

 Thanks for the additional thoughts, @patrickgarman.

 I've spent some time thinking more about this. My first thought is that it
 probably does make more sense to go with different function names, rather
 than trying to wedge this functionality into the existing functions. The
 function signatures and return values would be too different to make any
 sense.

 That being said, I've run a bunch of tests and found mixed performance
 results. In many cases, looping over `add_metadata()` is actually much
 *faster* than running a single `INSERT INTO` with lots of values.
 [attachment:meta-bulk.php] shows what I've been testing - it's meant to
 be run via `$ wp eval-file /path/to/meta-bulk.php`. Comment stuff out to
 test `add_metadatas()` vs an `add_metadata()` loop. Obviously, these
 tests are not scientific, but a bit of testing shows that bulk `INSERT`
 statements are on the order of 2x *slower*, though depending on the size
 of the `meta_value` and `meta_key` strings, the memory footprint can be
 somewhat smaller.
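
 For reference, the bulk version amounts to something like the following.
 This is a simplified sketch of the general shape, not the actual
 attachment - `add_metadatas()` is a hypothetical name, and the real test
 file handles sanitization, caching, and the meta-type tables that I'm
 omitting here:

 ```php
 /*
  * Sketch of a hypothetical bulk add_metadatas() for postmeta.
  * Builds one multi-row INSERT instead of one query per row.
  * Assumes a WordPress context ($wpdb, maybe_serialize()).
  */
 function add_metadatas( $object_id, array $meta ) {
     global $wpdb;

     $placeholders = array();
     $values       = array();

     foreach ( $meta as $key => $value ) {
         $placeholders[] = '(%d, %s, %s)';
         $values[]       = $object_id;
         $values[]       = $key;
         $values[]       = maybe_serialize( $value );
     }

     // One INSERT with many VALUES tuples.
     $sql = "INSERT INTO {$wpdb->postmeta} (post_id, meta_key, meta_value) VALUES "
         . implode( ', ', $placeholders );

     return $wpdb->query( $wpdb->prepare( $sql, $values ) );
 }
 ```

 The string concatenation in `implode()` is also where the very-large-
 payload weirdness mentioned below shows up: the query string itself can
 get enormous.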

 I also found that the behavior of `add_metadatas()` tended to get pretty
 weird at very large sizes - PHP had a hard time with string
 concatenation once the query string passed a certain length.

 Am I doing something wrong? Maybe there's a kind of metadata payload that
 I haven't thought of, where a bulk `INSERT` performs much better.

 You'll also see that I experimented with wrapping the `add_metadata()`
 loop in a transaction. This *did* have a noticeable performance benefit -
 depending on the nature of the metadata, anywhere from a 20-40% decrease
 in elapsed time. I would be somewhat wary of introducing a function for
 this into core, though - an interrupted transaction loses all of its
 uncommitted writes, and for certain kinds of payloads, there can be
 performance problems during the commit. There might be benefits to using
 transactions like this in some places (I'm thinking of the importer -
 @rmccue, have you thought about this kind of thing?), but it would have
 to be in situations where the routine is idempotent or has other
 failsafes built in. There might also be considerations related to
 database replication and the like that make this improvement the kind of
 thing that ought to be handled on an implementation-specific basis.
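
 For concreteness, the transaction experiment is just this (a sketch -
 raw queries against `$wpdb`, with error handling and rollback logic
 omitted; `$post_id` and `$meta` stand in for whatever you're inserting):

 ```php
 /*
  * Sketch: wrap an add_metadata() loop in a single transaction.
  * Assumes InnoDB - on MyISAM these statements are effectively no-ops.
  */
 global $wpdb;

 $wpdb->query( 'START TRANSACTION' );

 foreach ( $meta as $key => $value ) {
     add_metadata( 'post', $post_id, $key, $value );
 }

 // All of the inserts are flushed in a single commit, which is where
 // most of the time savings comes from.
 $wpdb->query( 'COMMIT' );
 ```

 If the process dies between `START TRANSACTION` and `COMMIT`, none of
 the rows are persisted - which is exactly the data-loss caveat above.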

 @pento Do you have any wisdom to impart about the potential benefits of
 (a) large `INSERT INTO ... VALUES` queries, and/or (b) running large
 numbers of small `INSERT` or `UPDATE` queries inside of a transaction for
 the performance benefit?

--
Ticket URL: <https://core.trac.wordpress.org/ticket/34848#comment:19>
WordPress Trac <https://core.trac.wordpress.org/>
WordPress publishing platform