Томас Петр - 1 month ago
JSON Question

json_decode - how to speed it up in a GuzzleHttp async request

In my application I'm using the GuzzleHttp library. It's probably not the problem, but it's worth mentioning.

Every minute (via cron) I need to get data from 40+ addresses, so I chose the GuzzleHttp lib to be as fast as possible.

Guzzle code:

    $client = new Client();
    $rectangles = $this->db->query("SELECT * FROM rectangles");

    $requests = function ($rectangles) {
        foreach ($rectangles as $rectangle) {
            // some GEO coords (not important)
            $left = $rectangle["lft"];
            $right = $rectangle["rgt"];
            $top = $rectangle["top"];
            $bottom = $rectangle["bottom"];

            $this->indexes[] = $rectangle;

            $uri = "https://example.com/?left=$left&top=$top&right=$right&bottom=$bottom";
            yield new Request("GET", $uri);
        }
    };

    $pool = new Pool($client, $requests($rectangles), [
        'concurrency' => 5,
        'fulfilled' => function ($response, $index) {
            $resp = $response->getBody();
            $carray = json_decode($resp, true);

            if ($carray["data"] != null) {
                $alerts = array_filter($carray["data"], function ($alert) {
                    return $alert["type"] == 'xxx';
                });

                $this->data = array_merge($this->data, $alerts);
                $this->total_count += count($alerts);
            }
        },
        'rejected' => function ($reason, $index) {},
    ]);

    $promise = $pool->promise();
    $promise->wait();

    return $this->data;

Of course, I benchmarked this:

1. getting data from another server: 0.000xx sec
2. json_decode: 0.001-0.0100 sec (this is probably the problem :-()

The entire code takes about 6-8 seconds. It depends on the amount of data that is on a remote server.

All along I thought Guzzle performs the requests asynchronously, so the whole batch would take only as long as the longest request.

(slowest request = 200 ms => all requests = 200 ms) - But this is probably not true! Or I am doing something wrong.

I used an associative array in json_decode (I feel this saved about 1 sec, but I'm not sure).

My question is: can I optimize this code further and speed it up?

I'd like it to be as fast as the slowest single request (0.200 sec).

PS: The data I'm getting from the URLs is just long JSON. Thanks!

EDIT: I changed 'concurrency' => 5 to 'concurrency' => 100 and now the duration is about 2-4 sec.


To start, increase the concurrency value in the Pool config to the total number of requests you need to send. This should be fine and may in fact make things even faster.
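For example, a sketch of that change (assuming Guzzle is installed via Composer and that `$rectangles` has been fetched into a plain PHP array, so `count()` gives the total number of requests up front):

```php
<?php
// Sketch only: assumes `composer require guzzlehttp/guzzle` and that
// $requests is the generator factory from the question.
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

$pool = new Pool($client, $requests($rectangles), [
    // One slot per request, so every request is dispatched immediately.
    'concurrency' => count($rectangles),
    'fulfilled'   => function ($response, $index) { /* decode + filter here */ },
    'rejected'    => function ($reason, $index) { /* log the failure */ },
]);

// Without wait() the pool never actually runs to completion.
$pool->promise()->wait();
```

With only ~40 requests per minute this is safe; with thousands you would want to cap concurrency to avoid exhausting sockets or hitting rate limits on the remote server.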

In regards to speeding up json_decode by milliseconds, this probably depends on a lot of factors, including the hardware of the server that processes the JSON as well as the varying sizes of the JSON payloads. I don't think there is much you can do programmatically in PHP to speed up that core function. I could be wrong, though.
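If you want to verify this on your own hardware, here is a stand-alone micro-benchmark sketch (the payload is synthetic, not the real GEO data from the question):

```php
<?php
// Synthetic payload roughly imitating a "long JSON" response.
$payload = json_encode(["data" => array_fill(0, 10000, ["type" => "xxx", "val" => 1.5])]);

$start = microtime(true);
$assoc = json_decode($payload, true);   // decode to associative arrays
$assocTime = microtime(true) - $start;

$start = microtime(true);
$objects = json_decode($payload);       // decode to stdClass objects
$objectTime = microtime(true) - $start;

printf("assoc: %.5fs, objects: %.5fs\n", $assocTime, $objectTime);
```

On typical hardware both decodes finish in milliseconds even for a payload this size, which suggests the whole-seconds difference you saw is more likely coming from the request scheduling than from json_decode itself.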

Another part of your code to look at is: $this->data = array_merge($this->data, $alerts); You could try using a loop instead.

You are also doing double work: array_filter iterates over the array internally, and then array_merge iterates over the result again.

So, instead of:

    if ($carray["data"] != null) {
        $alerts = array_filter($carray["data"], function($alert) {
            return $alert["type"] == 'xxx';
        });

        $this->data = array_merge($this->data, $alerts);
        $this->total_count += count($alerts);
    }

Maybe try this:

    if ($carray["data"] != null) {
        foreach ($carray["data"] as $cdata) {
            if ($cdata["type"] == 'xxx') {
                $this->data[] = $cdata;
                $this->total_count++;
            }
        }
    }