
Commit

Update README.md
freekmurze authored May 13, 2024
1 parent c1e7bf9 commit ae721b8
Showing 1 changed file with 0 additions and 41 deletions.
README.md: 0 additions, 41 deletions
@@ -68,7 +68,6 @@ The package will automatically register itself.
- [`insertBeforeKey`](#insertbeforekey)
- [`none`](#none)
- [`paginate`](#paginate)
- [`parallelMap`](#parallelmap)
- [`path`](#path)
- [`pluckMany`](#pluckmany)
- [`pluckManyValues`](#pluckmanyvalues)
@@ -614,46 +613,6 @@ This paginates the contents of `$posts` with 5 items per page. `paginate` accept
paginate(int $perPage = 15, string $pageName = 'page', int $page = null, int $total = null, array $options = [])
```
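For illustration only (not part of this diff): a minimal usage sketch of the signature above, assuming `$posts` holds an array or collection of items.

```php
// Paginate the collection into pages of 5 items each;
// the macro is expected to return a Laravel paginator instance.
$paginated = collect($posts)->paginate(5);
```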

### `parallelMap`

Identical to `map`, but each item in the collection is processed in parallel. Before using this macro, you should pull in the `amphp/parallel-functions` package.

```bash
composer require amphp/parallel-functions
```

Be aware that parallel processing introduces some overhead under the hood. When your `$callable` is only a simple operation, it's probably better to use `map` instead. Also keep in mind that `parallelMap` can be memory intensive.

```php
$pageSources = collect($urls)->parallelMap(function($url) {
return file_get_contents($url);
});
```

The page contents of the given `$urls` will be fetched concurrently. The underlying `amphp/parallel` package allows a maximum of `32` concurrent processes by default.

There is a second (optional) parameter, through which you can define a custom parallel processing pool. It looks like this:

```php
use Amp\Parallel\Worker\DefaultPool;

$pool = new DefaultPool(8);

$pageSources = collect($urls)->parallelMap(function($url) {
return file_get_contents($url);
}, $pool);
```

If you don't need to extend the worker pool, or don't want to create the pool yourself, you can pass an integer as the number of workers you'd like to use. A new `DefaultPool` will be created for you:

```php
$pageSources = collect($urls)->parallelMap(function($url) {
return file_get_contents($url);
}, 8);
```

This helps to reduce memory overhead, as the default worker pool limit is `32` (as defined in `amphp/parallel`). In many cases, using fewer worker threads can significantly reduce memory and processing overhead. Benchmark and customise the worker thread limit to suit your particular use case.

### `path`

Returns an item from the collection with multidimensional data using "dot" notation.
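The diff context is cut off here, but as a hedged illustration of the description above (the keys and values below are made up for this sketch, and assume `path` resolves nested keys the way Laravel's `data_get` does):

```php
$collection = collect([
    'user' => [
        'name' => 'Sebastian',
    ],
]);

// Retrieve a nested item using "dot" notation.
$collection->path('user.name'); // 'Sebastian'
```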
