
php - How can I efficiently process 50k+ Gravity Forms entries without running into memory or timeout issues?


I'm working on a WordPress shortcode that aggregates data from Gravity Forms entries. I have:

  • Around 50,000 entries on an enrollment form
  • Around 30,000 entries on a feedback form
  • Over 200 trainings
  • I need to calculate averages from a few specific fields across all matching entries for each training and for all trainings as a whole

My initial plan was to fetch all entries once, group them by training, and accumulate the global totals in the same pass.

The issue: Using GFAPI::get_entries() with 'page_size' => 0 or any large range quickly causes:

Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes)

When I chunk the requests with paging instead, the script times out (shared hosting, so I can't raise the memory or execution limits).

Here’s a simplified version of what I’m doing:

// Now let's get all of the entries
$search_criteria = [
    'status'     => 'active',
    'start_date' => $start, // Which is 2020
    'end_date'   => $end,   // Which is today
];

$enrolled_entries = [];

$chunk_size = 500;
$offset = 0;

do {
    $paging = [
        'offset'    => $offset,
        'page_size' => $chunk_size,
    ];

    $entries_chunk = GFAPI::get_entries( $enroll_form_ids, $search_criteria, [], $paging );
    if ( is_wp_error( $entries_chunk ) ) {
        break;
    }

    $enrolled_entries = array_merge( $enrolled_entries, $entries_chunk );

    $retrieved_count = count( $entries_chunk );
    $offset += $chunk_size;

} while ( $retrieved_count === $chunk_size );

I only need averages, counts, and basic stats; I don't need every field or the full entry object.
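Since I only need running totals, I also tried a variant that aggregates inside the paging loop instead of accumulating everything with array_merge, so memory stays bounded at one chunk. The field IDs ('3' for the training identifier, '5' for the value to average) are placeholders for my real fields:

```php
$sums   = [];
$counts = [];
$offset = 0;
$chunk_size = 500;

do {
    $paging        = [ 'offset' => $offset, 'page_size' => $chunk_size ];
    $entries_chunk = GFAPI::get_entries( $enroll_form_ids, $search_criteria, [], $paging );
    if ( is_wp_error( $entries_chunk ) ) {
        break;
    }

    // Fold each chunk into running totals instead of keeping the entries.
    foreach ( $entries_chunk as $entry ) {
        $training = rgar( $entry, '3' );          // training identifier field (placeholder ID)
        $score    = (float) rgar( $entry, '5' );  // field to average (placeholder ID)
        $sums[ $training ]   = ( $sums[ $training ] ?? 0 ) + $score;
        $counts[ $training ] = ( $counts[ $training ] ?? 0 ) + 1;
    }

    $retrieved_count = count( $entries_chunk );
    $offset         += $chunk_size;
    unset( $entries_chunk ); // release the chunk before fetching the next page

} while ( $retrieved_count === $chunk_size );

// Per-training and global averages from the running totals.
$averages = [];
foreach ( $sums as $training => $sum ) {
    $averages[ $training ] = $sum / $counts[ $training ];
}
$global_average = array_sum( $sums ) / max( 1, array_sum( $counts ) );
```

This keeps memory flat, but on shared hosting the total wall-clock time of ~100 paged requests is still the bottleneck.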

What I'm looking for:

Is there a better way to efficiently aggregate values across many Gravity Forms entries?
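One direction I've considered is querying the Gravity Forms entry meta table directly so MySQL does the aggregation instead of PHP. This is only a sketch: it assumes the GF 2.3+ table layout (where field values live in the entry meta table keyed by field ID) and uses a placeholder field ID of '5':

```php
global $wpdb;

// Resolve the Gravity Forms table names (usually wp_gf_entry / wp_gf_entry_meta).
$entry_table = GFFormsModel::get_entry_table_name();
$meta_table  = GFFormsModel::get_entry_meta_table_name();

$sql = $wpdb->prepare(
    "SELECT m.form_id,
            AVG(m.meta_value + 0) AS avg_value,  -- +0 coerces the text value to a number
            COUNT(*)              AS entry_count
     FROM {$meta_table} m
     INNER JOIN {$entry_table} e ON e.id = m.entry_id
     WHERE m.meta_key = %s
       AND e.status = 'active'
       AND e.date_created BETWEEN %s AND %s
     GROUP BY m.form_id",
    '5',    // field ID stored as meta_key (placeholder)
    $start,
    $end
);

$rows = $wpdb->get_results( $sql );
```

A single GROUP BY query like this would avoid ever loading 50k entries into PHP, but I'm not sure whether bypassing GFAPI this way is considered safe or future-proof.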

Is there a server-safe way to handle this in chunks, background jobs, or something similar within WordPress?
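For the background-job angle, the rough shape I have in mind is processing one page per WP-Cron tick, keeping the running totals in an option, and caching the finished result in a transient that the shortcode just reads. The hook name, option keys, form ID, and field ID below are all placeholders:

```php
add_action( 'my_gf_aggregate_batch', function () {
    $state = get_option( 'my_gf_aggregate_state', [ 'offset' => 0, 'sum' => 0.0, 'count' => 0 ] );

    $search_criteria = [ 'status' => 'active' ];
    $paging          = [ 'offset' => $state['offset'], 'page_size' => 500 ];
    $entries         = GFAPI::get_entries( 123, $search_criteria, [], $paging ); // 123 = form ID (placeholder)

    if ( is_wp_error( $entries ) || empty( $entries ) ) {
        // Done: publish the result and reset for the next run.
        if ( $state['count'] > 0 ) {
            set_transient( 'my_gf_averages', $state['sum'] / $state['count'], DAY_IN_SECONDS );
        }
        delete_option( 'my_gf_aggregate_state' );
        return;
    }

    foreach ( $entries as $entry ) {
        $state['sum']   += (float) rgar( $entry, '5' ); // field 5 = value to average (placeholder)
        $state['count'] += 1;
    }
    $state['offset'] += 500;
    update_option( 'my_gf_aggregate_state', $state, false );

    // Schedule the next batch; each request stays well under the timeout.
    wp_schedule_single_event( time() + 60, 'my_gf_aggregate_batch' );
} );
```

I don't know if WP-Cron is reliable enough on shared hosting for this, or whether something like Action Scheduler would be the better tool.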

My full script can be seen here:

Thanks for any help or direction!
