File Processing Using file_get_contents Encounters Memory Exhaustion
When working with large files in PHP, using file_get_contents to read the entire file into a variable can cause memory exhaustion errors. The whole file is held in memory at once, so a sufficiently large file will push the script past PHP's memory_limit and abort it with an "Allowed memory size exhausted" fatal error.
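For illustration, this is the pattern that triggers the problem; the file path is only a placeholder:

// Naive approach: the entire file is loaded into $data at once.
// With a multi-gigabyte file and a typical memory_limit, this fails
// with "Allowed memory size of N bytes exhausted".
$data = file_get_contents("my/large/file");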
To overcome this issue, a more efficient approach is to use file pointers and process the file in chunks. This way, only the current portion of the file is held in memory at any given time.
Here's a custom function that implements this chunked file processing:
function file_get_contents_chunked($file, $chunk_size, $callback)
{
    $handle = fopen($file, "r");

    // fopen() returns false on failure rather than throwing, so check explicitly.
    if ($handle === false) {
        trigger_error("file_get_contents_chunked::could not open " . $file, E_USER_NOTICE);
        return false;
    }

    $i = 0;
    try {
        // Read one chunk per iteration so only $chunk_size bytes are held in memory.
        while (!feof($handle)) {
            call_user_func_array($callback, [fread($handle, $chunk_size), &$handle, $i]);
            $i++;
        }
    } catch (Exception $e) {
        // Surface any exception thrown by the callback without aborting the script.
        trigger_error("file_get_contents_chunked::" . $e->getMessage(), E_USER_NOTICE);
        return false;
    } finally {
        fclose($handle);
    }

    return true;
}
To use this function, define a callback function to handle each chunk of data:
$success = file_get_contents_chunked("my/large/file", 4096, function($chunk, &$handle, $iteration) {
// Perform file processing here
});
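For example, a callback that counts the lines in the file without ever loading it whole might look like this (a minimal sketch; the path, the chunk size, and the $lineCount variable are only for illustration):

$lineCount = 0;

$success = file_get_contents_chunked("my/large/file", 4096, function($chunk, &$handle, $iteration) use (&$lineCount) {
    // Only the current chunk is examined, so memory use stays bounded.
    $lineCount += substr_count($chunk, "\n");
});

if ($success) {
    echo "Lines: " . $lineCount . PHP_EOL;
}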
Additionally, consider refactoring regex operations to use native string functions such as strpos, substr, trim, and explode where the pattern is simple enough. Avoiding the regex engine can significantly improve performance when scanning large files chunk by chunk, as in the sketch below.
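For instance, a simple key=value lookup that might otherwise be written with preg_match can be done with strpos and substr (the "name=" field and the chunk layout here are hypothetical):

// Regex version: preg_match('/name=([^\n]*)/', $chunk, $m);
// String-function version, avoiding the regex engine:
$pos = strpos($chunk, "name=");
if ($pos !== false) {
    $end   = strpos($chunk, "\n", $pos);
    $line  = ($end === false) ? substr($chunk, $pos) : substr($chunk, $pos, $end - $pos);
    $value = trim(substr($line, strlen("name=")));
}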