Show HN: HTTP/2 Python-Asyncio Web Microframework (gitlab.com/pgjones)
173 points by pgjones on April 23, 2018 | 42 comments



Neat, good to see more `asyncio` frameworks coming along.

Python's going to have a bit of an awkward time with two completely different ecosystems for threaded vs. asyncio approaches, but it's necessary progress.

One thing I'd be really keen to see is asyncio frameworks starting to consider adopting ASGI as a common interface. Quart, sanic, and aiohttp each currently have their own gunicorn worker classes and HTTP parsing, and none share a common interface for handling the request/response exchange between server and application.

It's a really high barrier to new asyncio frameworks, and it means we're not able to get shared middleware, such as WSGI's whitenoise or Werkzeug debugger, or the increased robustness that shared server implementations would tend to result in.

Would be interested to know what OP's position on this is?


> Python's going to have a bit of an awkward time with two completely different ecosystems for threaded vs. asyncio approaches, but it's necessary progress.

It's threaded vs async/await (and even then projects like https://github.com/dabeaz/curio bridge that gap really well), rather than threaded vs asyncio. asyncio is just one (really really poor) implementation of async/await coroutines on Python.

> One thing I'd be really keen to see is asyncio frameworks starting to consider adopting ASGI as a common interface.

From what I've seen of the ASGI spec, it makes it incredibly easy (like most asyncio stuff) to DoS yourself with lack of backpressure. You get your callback called with data, and like all callback systems you can't exactly not get called.


> It's threaded vs async/await

Fair enough, sure.

> lack of backpressure

You’ll get new calls into the application on new requests, yes. Request bodies are pulled tho.

You can perfectly well ensure that server implementations properly handle flow control, and if necessary also have a configurable number of maximum concurrent requests.

Either way, those sorts of concerns would be far better addressed by having server implementations against a common interface, than by framework authors having to handle (and continually re-implement) the nitty gritty details of high/low watermarks and pausing/resuming transports.
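For context, a rough sketch of the application side under ASGI's later single-callable form (the spec at the time differed in details, and the handler below is illustrative, not Quart's internals). The point is that the body is pulled with receive() rather than pushed:

    # Minimal ASGI-style application sketch (illustrative only).
    async def app(scope, receive, send):
        assert scope["type"] == "http"

        # The application *pulls* the request body chunk by chunk, so a slow
        # consumer naturally exerts backpressure on the server.
        body = b""
        while True:
            message = await receive()
            body += message.get("body", b"")
            if not message.get("more_body", False):
                break

        await send({"type": "http.response.start",
                    "status": 200,
                    "headers": [(b"content-type", b"text/plain")]})
        await send({"type": "http.response.body",
                    "body": b"received %d bytes" % len(body)})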


> asyncio is just one (really really poor) implementation of async/await coroutines on Python.

Are there other implementations, or is it just that the one existing implementation (asyncio) is poor?

I tried to understand asyncio a few times and failed. Threading was easy, Promises in JS were easy and I blocked on asyncio.


Me too. Grokked JS promises, but async/await threw me. Why is single-threaded async, e.g. Node and Python/AsyncIO, such a big deal when we have languages like Elixir, Go and Clojure, which have real concurrency and no trade-offs?


There are other implementations, such as Curio (https://github.com/dabeaz/curio) or Trio (https://github.com/python-trio/trio). Both are simpler and more hand-holdy (and less footgun) than asyncio.

Disclaimer: I am a trio dev, though.
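A tiny taste of the style, purely as an illustrative sketch (not taken from the tutorial):

    import trio

    async def child(name, delay):
        await trio.sleep(delay)
        print(name, "done")

    async def main():
        # A nursery scopes concurrency: main() can't exit until both children
        # finish, and an exception in one cancels the other.
        async with trio.open_nursery() as nursery:
            nursery.start_soon(child, "a", 1)
            nursery.start_soon(child, "b", 2)

    trio.run(main)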


Thanks, the trio tutorial is really enlightening!

I was hoping (with Python, not trio in particular) that it would all end up with something like Promises which make coding really easy but I guess not.

This is of course a matter of personal taste, but having

  call_an_async_function(with, some, parameters)
  .then(with_the_result_when_it_comes => do_something)
  some_other_code_which_runs_synchronously
  and_some_more
seems easier to my eye than the whole idea of calling await in async functions


Python doesn't really have the syntax for it, the promise library uses lambda which is pretty bulky: https://pypi.org/project/promise/
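If I have the package's API right, the chaining ends up looking roughly like this:

    # Promise-style chaining with the `promise` package; the lambdas are what
    # make it feel bulky next to JS arrow functions (illustrative sketch).
    from promise import Promise

    def async_fn(x):
        # Resolves immediately here; a real promise would resolve later.
        return Promise(lambda resolve, reject: resolve(x * 2))

    async_fn(21).then(lambda result: print("got", result))

    print("this runs synchronously")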

I wonder if you could do something nicer using the `with` syntax?

    with async_fn(with, params) as result:
        do_something(result)
    this_runs_sync()
    then_this_does()
It's still going to be annoying when they nest, maybe an async with which makes its whole content act like promises with `then` calls?

    async with fetch_thing() as result:
        json = async_jsonify(result)
        post_result = async_post(json)
        json_result = async_jsonify(post_result)
    then_sync_stuff()
Though you also want to be able to return promises, I suppose you could allow `return async with`, and you also need to be able to catch them. Not sure if this makes any sense to look further at, but I don't think we're getting arrow functions and the JS syntax without those is pretty clunky.


What's wrong with asyncio? For those of us not in the loop.


I'm interested in exploring ASGI as a common interface for Datasette, but I also want the convenience of a neat request and response object. Is there an ASGI equivalent of something like https://webob.org/ yet?


Nope, but probably going to be working towards a library similar to that/werkzeug. Request & Response objects, plus some standard middleware.


more like threaded vs asyncio vs gevent vs tornado vs libuv vs twisted vs multiprocessing


Isn't this what sanic was supposed to be?

https://github.com/channelcat/sanic


Sanic is Flask-like, whereas Quart implements the Flask API (some Flask extensions work with Quart https://pgjones.gitlab.io/quart/flask_extensions.html#suppor...).

The key part I wanted to highlight here though is that Quart serves HTTP/2, which I think sets it apart from most of the other Python frameworks. (I know Twisted also does this https://medium.com/python-pandemonium/how-to-serve-http-2-us...)


+1, been using sanic for a couple of years; it's an awesome, easily grokkable micro framework. Sanic is flask-like in a lot of places, whereas a quick perusal of the quart code makes it look like they go the extra mile on the flask scale, but maybe take on some complexity in the process. Up to the consumer which trade-off they pick. I tend to think that since there is no WSGI equivalent for asyncio (something people are thinking about: https://github.com/channelcat/sanic/issues/761) you want a slightly different model than flask anyway.


There is ASGI: https://github.com/django/asgiref/blob/master/specs/asgi.rst

But I'm not sure how broadly it has been accepted as a common spec.


Unlike flask and OP's solution, sanic seems to have a saner API, without a global request object.
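Roughly the difference, with hypothetical handlers:

    # Flask/Quart style: `request` is a context-local global.
    from quart import Quart, jsonify, request

    app = Quart(__name__)

    @app.route("/echo", methods=["POST"])
    async def echo():
        data = await request.get_json()
        return jsonify(data)

    # Sanic style: the request is passed explicitly to the handler
    # (commented out here, since it would need its own app object).
    # from sanic.response import json
    #
    # @app.route("/echo", methods=["POST"])
    # async def echo(request):
    #     return json(request.json)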


This looks really nice, and Flask compatibility is great.

One question, in the asyncio docs (really nice BTW, I hadn't tried that out yet and instantly grasped your example) you mention the common pitfall of `await awaitable.attribute` with missing brackets. In the Migration from Flask docs you give some examples where you need await like `await request.data` and `await request.get_json()` - do these need brackets in the same way or is `request` special? Same deal with `test_client` straight after that.

BTW, do all routes have to be async here - even your quickstart that just returns 'hello'?

One other thing - since you require Python 3.6 anyway, it'd probably make sense to recommend `pipenv` instead of `venv` for installation, which would probably make your docs simpler too.


For the common pitfall question: when you do `await expression`, Python evaluates `expression` and then awaits the result. So when you do `await request.json()`, Python calls `request.json()` and then awaits the return value (which will resolve to the JSON, at some point).

The same thing with `await request.data`. You can easily avoid this pitfall by writing this code:

    data = await resp.json()
    print(data['attribute'])
instead of this:

    print((await resp.json())['attribute'])
TBH I find that section of the docs more confusing than helpful; it's not an issue you'd generally run into, IMO.


`request` itself isn't an awaitable, whereas the `data` property and `get_json` method both return awaitables (https://gitlab.com/pgjones/quart/blob/master/quart/wrappers/...). I think these are probably things that become second nature through usage.
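In a handler it ends up looking something like this (a minimal sketch, not taken from the docs):

    from quart import Quart, request

    app = Quart(__name__)

    @app.route("/example", methods=["POST"])
    async def example():
        body = await request.data           # property returning an awaitable
        payload = await request.get_json()  # coroutine method, so brackets are needed
        return str(payload)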

The routes don't have to be async as Quart will wrap them in a coroutine anyway. However I'm not sure there is any advantage to not adding the async keyword.

Thanks, I need to learn how pipenv works.


Pipenv is not bad. I switched to it in about 30 minutes.

To start a new project:

    pipenv --three
To add packages:

    pipenv install <package>
    pipenv install -d <development packages>
    pipenv install -r requirements.txt
Then check in your Pipfile and Pipfile.lock.

When someone makes a fresh checkout, they cd to where the Pipfile and Pipfile.lock are and run:

    pipenv install
When you're ready to update your dependencies do the following:

    rm Pipfile.lock
    pipenv update (-d to load development packages)
To get a shell, cd to where the Pipfile of your project is:

    pipenv shell
To run something in the virtualenv when the virtualenv is not activated:

    pipenv run <command>


The author was on a recent episode of Talk Python. https://talkpython.fm/episodes/show/147/quart-flask-but-3x-f...


I think this is all well and good, especially the compatibility with Flask. However, the biggest issue is the data access layer. Most flask apps will use an ORM like SQLAlchemy (SA) to build their data access layer.

SA, unfortunately, does not have an asynchronous version (it's quite complex as it is). Therefore, I think it would require quite a lot of work to actually get a standard flask app working with quart.

However, if you've built your data access layer directly on psycopg, then I think you're good to go.


Have you any experience with GINO (https://github.com/fantix/gino) or Peewee-Async (https://github.com/05bit/peewee-async)? These look to be feasible alternatives.


Working on a Quart extension for GINO now.


An interesting deep dive on why this is the case (from 2015), in case anyone wants to know more:

http://techspot.zzzeek.org/2015/02/15/asynchronous-python-an...


One thing I've never been able to reconcile is Mike Bayer's article there, set against Yury Selivanov's reported results for `asyncpg`, which, at least as they're presented, do indicate a significant difference in throughput.

https://github.com/MagicStack/asyncpg

They're both incredibly experienced developers, and I don't have any good steer on the discrepancy.

Would love to see some independent benchmarking on a typical use case to get a clearer picture on how valuable (or not) asyncio is for high-throughput database operations.


Ehh...

Maybe using eventlet isn't necessarily native "async" but... it works just fine in our use cases...

http://eventlet.net/doc/modules/db_pool.html


I started dabbling with Quart a few months ago and it looks really promising, but I haven't developed any apps with it yet. My plan is to try building an app using it instead of Flask. I like Flask, but I think this has some benefits in supporting current and future web features, including websockets.


I've seen a few libraries like this now. Are they supposed to supersede the non asyncio versions? Is there any reason to still use flask? Will they be maintained like forks?


Could you add sqlalchemy integration docs as well? Because that is where I get stuck when thinking about async. How does psycogreen work with your asyncio APIs?


The sqlalchemy orm and asyncio are incompatible. There are packages like aiopg that allow for sqlalchemy to be used as a functional sql layer.

If you want to use the orm parts of sqlalchemy, the idea is to use thread executors and handle detaching of objects if you return them to event-loop code.
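Roughly, the executor approach looks something like this (`Session` and `User` are placeholders for your own session factory and model):

    import asyncio

    def load_user(user_id):
        # Runs in a worker thread, using the ordinary blocking SQLAlchemy ORM.
        session = Session()
        try:
            user = session.query(User).get(user_id)
            session.expunge(user)  # detach before handing it back to event-loop code
            return user
        finally:
            session.close()

    async def get_user(user_id):
        loop = asyncio.get_event_loop()
        # Push the blocking ORM work onto the default thread pool executor.
        return await loop.run_in_executor(None, load_user, user_id)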


Well, then how should we do database access? Because without that part sorted out, the rest of the framework is unusable.

Incidentally, my personal opinion is that this is the reason why we don't have massive adoption of asyncio despite lots of frameworks (like apistar).

Node.js, for example, has an asynchronous version of every database driver. It is probably worth the effort to make an asyncio-compatible DB layer.


Cool! Might try this for some smaller project. It's a shame that asyncio is so slow, though; this framework isn't going to win any speed awards, I guess.


It isn't, compared with other languages. However, Quart sits well within the Python ecosystem, especially with uvloop. I wrote some of this up: https://hackernoon.com/3x-faster-than-flask-8e89bfbe8e4f
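(For reference, swapping in uvloop is typically just an event loop policy change; this snippet is illustrative rather than anything Quart-specific:)

    import asyncio
    import uvloop

    # Replace the default asyncio event loop with uvloop's implementation.
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())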


Why did you choose to host this on gitlab over github?


I like the open source nature of gitlab, and its feature set. There is a mirror on github as well, mostly as people tend to look on github first.


Because everything doesn't need to be hosted on GitHub?


I don't think that was running through the project author's mind as a factor.

It is interesting just to know the reasoning. Nobody is bashing competition. It's just that GitHub has more mindshare and, as such, contributors/forks are more likely, and there are more 3rd-party integrations.

Again, competition is good for everyone, but let's not bash someone who is just curious why a project author would come to choose GitLab.


It might be that the author likes the fact that GitLab offers free private projects, which might be useful for his other work.

Or likes the interface over GitHub.

Or simply likes going against mainstream.

Full disclosure: I have used GitLab since 2015.


Weird question isn't it? We should be happy there are several great places to host code :)


Who says that OP isn't happy there is competition? I am. I'm also curious on the reasoning.



