aiomultiprocess

Take a modern Python codebase to the next level of performance.


On their own, AsyncIO and multiprocessing are useful, but limited: AsyncIO is still restricted to a single thread by the GIL, and multiprocessing only runs one task at a time in each process. But together, they can realize their full potential.

aiomultiprocess presents a simple interface, while running a full AsyncIO event loop on each child process, enabling levels of concurrency never before seen in a Python application. Each child process can execute multiple coroutines at once, limited only by the workload and number of cores available.

Gathering tens of thousands of network requests in seconds is as easy as:

async with Pool() as pool:
    results = await pool.map(<coroutine>, <items>)
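
For a self-contained taste of that concurrency, here is a minimal sketch that needs no network access. The workload (asyncio.sleep) and the timing are purely illustrative; how fast it finishes depends on your core count and the pool's default sizing:

import asyncio
import time

from aiomultiprocess import Pool

async def main():
    start = time.perf_counter()
    async with Pool() as pool:
        # Each child process runs many of these sleeps concurrently on
        # its own event loop, so 100 one-second sleeps finish in a few
        # seconds rather than 100.
        await pool.map(asyncio.sleep, [1] * 100)
    print(f"elapsed: {time.perf_counter() - start:.1f}s")

if __name__ == "__main__":
    asyncio.run(main())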

For more context, watch the PyCon US 2018 talk about aiomultiprocess, "Thinking Outside the GIL":

https://www.youtube-nocookie.com/embed/0kXaLh8Fz3k

Install

aiomultiprocess requires Python 3.6 or newer. You can install it from PyPI:

$ pip3 install aiomultiprocess

Usage

Most of aiomultiprocess mimics the standard multiprocessing module whenever possible, while accounting for places that benefit from async functionality.

Executing a coroutine on a child process is as simple as:

import asyncio

from aiohttp import request
from aiomultiprocess import Process

async def fetch(url):
    # request() returns an async context manager; read the body inside it
    async with request("GET", url) as response:
        return await response.text()

async def main():
    # args must be a tuple of arguments, not a bare string
    p = Process(target=fetch, args=("https://jreese.sh",))
    p.start()
    await p.join()

if __name__ == "__main__":
    asyncio.run(main())

If you want to get results back from that coroutine, Worker makes that available:

import asyncio

from aiohttp import request
from aiomultiprocess import Worker

async def fetch(url):
    async with request("GET", url) as response:
        return await response.text()

async def main():
    p = Worker(target=fetch, args=("https://jreese.sh",))
    p.start()
    # joining a Worker returns the coroutine's result
    response = await p.join()

if __name__ == "__main__":
    asyncio.run(main())

If you want a managed pool of worker processes, then use Pool:

import asyncio

from aiohttp import request
from aiomultiprocess import Pool

async def fetch(url):
    async with request("GET", url) as response:
        return await response.text()

async def main():
    urls = ["https://jreese.sh", ...]
    async with Pool() as pool:
        results = await pool.map(fetch, urls)

if __name__ == "__main__":
    asyncio.run(main())
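
Pool also accepts tuning knobs. The sketch below uses the processes and childconcurrency keyword arguments, which match recent releases of aiomultiprocess; the stand-in workload is hypothetical, and it is worth checking your installed version's Pool signature before relying on the exact names:

import asyncio

from aiomultiprocess import Pool

async def double_after(delay):
    # stand-in workload; swap in your own coroutine
    await asyncio.sleep(delay)
    return delay * 2

async def main():
    # 4 worker processes, each running up to 8 coroutines at a time,
    # for at most 32 tasks in flight across the pool
    async with Pool(processes=4, childconcurrency=8) as pool:
        results = await pool.map(double_after, [0.1] * 20)
        print(results)

if __name__ == "__main__":
    asyncio.run(main())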

License

aiomultiprocess is copyright John Reese, and licensed under the MIT license. I am providing code in this repository to you under an open source license. This is my personal repository; the license you receive to my code is from me and not from my employer. See the LICENSE file for details.
