first commit
@@ -0,0 +1 @@
pip
@@ -0,0 +1,28 @@
Copyright 2010 Pallets

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -0,0 +1,137 @@
Metadata-Version: 2.1
Name: Flask
Version: 1.1.2
Summary: A simple framework for building complex web applications.
Home-page: https://palletsprojects.com/p/flask/
Author: Armin Ronacher
Author-email: armin.ronacher@active-4.com
Maintainer: Pallets
Maintainer-email: contact@palletsprojects.com
License: BSD-3-Clause
Project-URL: Documentation, https://flask.palletsprojects.com/
Project-URL: Code, https://github.com/pallets/flask
Project-URL: Issue tracker, https://github.com/pallets/flask/issues
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Web Environment
Classifier: Framework :: Flask
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI :: Application
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*
Requires-Dist: Werkzeug (>=0.15)
Requires-Dist: Jinja2 (>=2.10.1)
Requires-Dist: itsdangerous (>=0.24)
Requires-Dist: click (>=5.1)
Provides-Extra: dev
Requires-Dist: pytest ; extra == 'dev'
Requires-Dist: coverage ; extra == 'dev'
Requires-Dist: tox ; extra == 'dev'
Requires-Dist: sphinx ; extra == 'dev'
Requires-Dist: pallets-sphinx-themes ; extra == 'dev'
Requires-Dist: sphinxcontrib-log-cabinet ; extra == 'dev'
Requires-Dist: sphinx-issues ; extra == 'dev'
Provides-Extra: docs
Requires-Dist: sphinx ; extra == 'docs'
Requires-Dist: pallets-sphinx-themes ; extra == 'docs'
Requires-Dist: sphinxcontrib-log-cabinet ; extra == 'docs'
Requires-Dist: sphinx-issues ; extra == 'docs'
Provides-Extra: dotenv
Requires-Dist: python-dotenv ; extra == 'dotenv'

Flask
=====

Flask is a lightweight `WSGI`_ web application framework. It is designed
to make getting started quick and easy, with the ability to scale up to
complex applications. It began as a simple wrapper around `Werkzeug`_
and `Jinja`_ and has become one of the most popular Python web
application frameworks.

Flask offers suggestions, but doesn't enforce any dependencies or
project layout. It is up to the developer to choose the tools and
libraries they want to use. There are many extensions provided by the
community that make adding new functionality easy.


Installing
----------

Install and update using `pip`_:

.. code-block:: text

    pip install -U Flask


A Simple Example
----------------

.. code-block:: python

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def hello():
        return "Hello, World!"

.. code-block:: text

    $ env FLASK_APP=hello.py flask run
     * Serving Flask app "hello"
     * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)


Contributing
------------

For guidance on setting up a development environment and how to make a
contribution to Flask, see the `contributing guidelines`_.

.. _contributing guidelines: https://github.com/pallets/flask/blob/master/CONTRIBUTING.rst


Donate
------

The Pallets organization develops and supports Flask and the libraries
it uses. In order to grow the community of contributors and users, and
allow the maintainers to devote more time to the projects, `please
donate today`_.

.. _please donate today: https://psfmember.org/civicrm/contribute/transact?reset=1&id=20


Links
-----

* Website: https://palletsprojects.com/p/flask/
* Documentation: https://flask.palletsprojects.com/
* Releases: https://pypi.org/project/Flask/
* Code: https://github.com/pallets/flask
* Issue tracker: https://github.com/pallets/flask/issues
* Test status: https://dev.azure.com/pallets/flask/_build
* Official chat: https://discord.gg/t6rrQZH

.. _WSGI: https://wsgi.readthedocs.io
.. _Werkzeug: https://www.palletsprojects.com/p/werkzeug/
.. _Jinja: https://www.palletsprojects.com/p/jinja/
.. _pip: https://pip.pypa.io/en/stable/quickstart/
@@ -0,0 +1,48 @@
../../../bin/flask,sha256=98Kkf8ifDYj3i_HtgXsQicxHntmKxSwxdg2vMOEq_IA,249
Flask-1.1.2.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
Flask-1.1.2.dist-info/LICENSE.rst,sha256=SJqOEQhQntmKN7uYPhHg9-HTHwvY-Zp5yESOf_N9B-o,1475
Flask-1.1.2.dist-info/METADATA,sha256=3INpPWH6nKfZ33R2N-bQZy4TOe1wQCMweZc9mwcNrtc,4591
Flask-1.1.2.dist-info/RECORD,,
Flask-1.1.2.dist-info/WHEEL,sha256=8zNYZbwQSXoB9IfXOjPfeNwvAsALAjffgk27FqvCWbo,110
Flask-1.1.2.dist-info/entry_points.txt,sha256=gBLA1aKg0OYR8AhbAfg8lnburHtKcgJLDU52BBctN0k,42
Flask-1.1.2.dist-info/top_level.txt,sha256=dvi65F6AeGWVU0TBpYiC04yM60-FX1gJFkK31IKQr5c,6
flask/__init__.py,sha256=YnA9wkwbJcnb_jTT-nMsMFeFE_UWt33khKzdHmMSuyI,1894
flask/__main__.py,sha256=fjVtt3QTANXlpJCOv3Ha7d5H-76MwzSIOab7SFD9TEk,254
flask/__pycache__/__init__.cpython-38.pyc,,
flask/__pycache__/__main__.cpython-38.pyc,,
flask/__pycache__/_compat.cpython-38.pyc,,
flask/__pycache__/app.cpython-38.pyc,,
flask/__pycache__/blueprints.cpython-38.pyc,,
flask/__pycache__/cli.cpython-38.pyc,,
flask/__pycache__/config.cpython-38.pyc,,
flask/__pycache__/ctx.cpython-38.pyc,,
flask/__pycache__/debughelpers.cpython-38.pyc,,
flask/__pycache__/globals.cpython-38.pyc,,
flask/__pycache__/helpers.cpython-38.pyc,,
flask/__pycache__/logging.cpython-38.pyc,,
flask/__pycache__/sessions.cpython-38.pyc,,
flask/__pycache__/signals.cpython-38.pyc,,
flask/__pycache__/templating.cpython-38.pyc,,
flask/__pycache__/testing.cpython-38.pyc,,
flask/__pycache__/views.cpython-38.pyc,,
flask/__pycache__/wrappers.cpython-38.pyc,,
flask/_compat.py,sha256=8KPT54Iig96TuLipdogLRHNYToIcg-xPhnSV5VRERnw,4099
flask/app.py,sha256=tmEhx_XrIRP24vZg39dHMWFzJ2jj-YxIcd51LaIT5cE,98059
flask/blueprints.py,sha256=vkdm8NusGsfZUeIfPdCluj733QFmiQcT4Sk1tuZLUjw,21400
flask/cli.py,sha256=SIb22uq9wYBeB2tKMl0pYdhtZ1MAQyZtPL-3m6es4G0,31035
flask/config.py,sha256=3dejvQRYfNHw_V7dCLMxU8UNFpL34xIKemN7gHZIZ8Y,10052
flask/ctx.py,sha256=cks-omGedkxawHFo6bKIrdOHsJCAgg1i_NWw_htxb5U,16724
flask/debughelpers.py,sha256=-whvPKuAoU8AZ9c1z_INuOeBgfYDqE1J2xNBsoriugU,6475
flask/globals.py,sha256=OgcHb6_NCyX6-TldciOdKcyj4PNfyQwClxdMhvov6aA,1637
flask/helpers.py,sha256=IHa578HU_3XAAo1wpXQv24MYRYO5TzaiDQQwvUIcE6Q,43074
flask/json/__init__.py,sha256=6nITbZYiYOPB8Qfi1-dvsblwn01KRz8VOsMBIZyaYek,11988
flask/json/__pycache__/__init__.cpython-38.pyc,,
flask/json/__pycache__/tag.cpython-38.pyc,,
flask/json/tag.py,sha256=vq9GOllg_0kTWKuVFrwmkeOQzR-jdBD23x-89JyCCQI,8306
flask/logging.py,sha256=WcY5UkqTysGfmosyygSlXyZYGwOp3y-VsE6ehoJ48dk,3250
flask/sessions.py,sha256=G0KsEkr_i1LG_wOINwFSOW3ts7Xbv4bNgEZKc7TRloc,14360
flask/signals.py,sha256=yYLOed2x8WnQ7pirGalQYfpYpCILJ0LJhmNSrnWvjqw,2212
flask/templating.py,sha256=F8E_IZXn9BGsjMzUJ5N_ACMyZdiFBp_SSEaUunvfZ7g,4939
flask/testing.py,sha256=WXsciCQbHBP7xjHqNvOA4bT0k86GvSNpgzncfXLDEEg,10146
flask/views.py,sha256=eeWnadLAj0QdQPLtjKipDetRZyG62CT2y7fNOFDJz0g,5802
flask/wrappers.py,sha256=kgsvtZuMM6RQaDqhRbc5Pcj9vqTnaERl2pmXcdGL7LU,4736
@@ -0,0 +1,6 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.33.6)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

@@ -0,0 +1,3 @@
[console_scripts]
flask = flask.cli:main

@@ -0,0 +1 @@
flask
@@ -0,0 +1 @@
pip
@@ -0,0 +1,28 @@
Copyright 2007 Pallets

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -0,0 +1,106 @@
Metadata-Version: 2.1
Name: Jinja2
Version: 2.11.2
Summary: A very fast and expressive template engine.
Home-page: https://palletsprojects.com/p/jinja/
Author: Armin Ronacher
Author-email: armin.ronacher@active-4.com
Maintainer: Pallets
Maintainer-email: contact@palletsprojects.com
License: BSD-3-Clause
Project-URL: Documentation, https://jinja.palletsprojects.com/
Project-URL: Code, https://github.com/pallets/jinja
Project-URL: Issue tracker, https://github.com/pallets/jinja/issues
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Web Environment
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Text Processing :: Markup :: HTML
Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*
Description-Content-Type: text/x-rst
Requires-Dist: MarkupSafe (>=0.23)
Provides-Extra: i18n
Requires-Dist: Babel (>=0.8) ; extra == 'i18n'

Jinja
=====

Jinja is a fast, expressive, extensible templating engine. Special
placeholders in the template allow writing code similar to Python
syntax. Then the template is passed data to render the final document.

It includes:

- Template inheritance and inclusion.
- Define and import macros within templates.
- HTML templates can use autoescaping to prevent XSS from untrusted
  user input.
- A sandboxed environment can safely render untrusted templates.
- AsyncIO support for generating templates and calling async
  functions.
- I18N support with Babel.
- Templates are compiled to optimized Python code just-in-time and
  cached, or can be compiled ahead-of-time.
- Exceptions point to the correct line in templates to make debugging
  easier.
- Extensible filters, tests, functions, and even syntax.

Jinja's philosophy is that while application logic belongs in Python if
possible, it shouldn't make the template designer's job difficult by
restricting functionality too much.


Installing
----------

Install and update using `pip`_:

.. code-block:: text

    $ pip install -U Jinja2

.. _pip: https://pip.pypa.io/en/stable/quickstart/


In A Nutshell
-------------

.. code-block:: jinja

    {% extends "base.html" %}
    {% block title %}Members{% endblock %}
    {% block content %}
      <ul>
      {% for user in users %}
        <li><a href="{{ user.url }}">{{ user.username }}</a></li>
      {% endfor %}
      </ul>
    {% endblock %}


Links
-----

- Website: https://palletsprojects.com/p/jinja/
- Documentation: https://jinja.palletsprojects.com/
- Releases: https://pypi.org/project/Jinja2/
- Code: https://github.com/pallets/jinja
- Issue tracker: https://github.com/pallets/jinja/issues
- Test status: https://dev.azure.com/pallets/jinja/_build
- Official chat: https://discord.gg/t6rrQZH

@@ -0,0 +1,61 @@
Jinja2-2.11.2.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
Jinja2-2.11.2.dist-info/LICENSE.rst,sha256=O0nc7kEF6ze6wQ-vG-JgQI_oXSUrjp3y4JefweCUQ3s,1475
Jinja2-2.11.2.dist-info/METADATA,sha256=5ZHRZoIRAMHsJPnqhlJ622_dRPsYePYJ-9EH4-Ry7yI,3535
Jinja2-2.11.2.dist-info/RECORD,,
Jinja2-2.11.2.dist-info/WHEEL,sha256=kGT74LWyRUZrL4VgLh6_g12IeVl_9u9ZVhadrgXZUEY,110
Jinja2-2.11.2.dist-info/entry_points.txt,sha256=Qy_DkVo6Xj_zzOtmErrATe8lHZhOqdjpt3e4JJAGyi8,61
Jinja2-2.11.2.dist-info/top_level.txt,sha256=PkeVWtLb3-CqjWi1fO29OCbj55EhX_chhKrCdrVe_zs,7
jinja2/__init__.py,sha256=0QCM_jKKDM10yzSdHRVV4mQbCbDqf0GN0GirAqibn9Y,1549
jinja2/__pycache__/__init__.cpython-38.pyc,,
jinja2/__pycache__/_compat.cpython-38.pyc,,
jinja2/__pycache__/_identifier.cpython-38.pyc,,
jinja2/__pycache__/asyncfilters.cpython-38.pyc,,
jinja2/__pycache__/asyncsupport.cpython-38.pyc,,
jinja2/__pycache__/bccache.cpython-38.pyc,,
jinja2/__pycache__/compiler.cpython-38.pyc,,
jinja2/__pycache__/constants.cpython-38.pyc,,
jinja2/__pycache__/debug.cpython-38.pyc,,
jinja2/__pycache__/defaults.cpython-38.pyc,,
jinja2/__pycache__/environment.cpython-38.pyc,,
jinja2/__pycache__/exceptions.cpython-38.pyc,,
jinja2/__pycache__/ext.cpython-38.pyc,,
jinja2/__pycache__/filters.cpython-38.pyc,,
jinja2/__pycache__/idtracking.cpython-38.pyc,,
jinja2/__pycache__/lexer.cpython-38.pyc,,
jinja2/__pycache__/loaders.cpython-38.pyc,,
jinja2/__pycache__/meta.cpython-38.pyc,,
jinja2/__pycache__/nativetypes.cpython-38.pyc,,
jinja2/__pycache__/nodes.cpython-38.pyc,,
jinja2/__pycache__/optimizer.cpython-38.pyc,,
jinja2/__pycache__/parser.cpython-38.pyc,,
jinja2/__pycache__/runtime.cpython-38.pyc,,
jinja2/__pycache__/sandbox.cpython-38.pyc,,
jinja2/__pycache__/tests.cpython-38.pyc,,
jinja2/__pycache__/utils.cpython-38.pyc,,
jinja2/__pycache__/visitor.cpython-38.pyc,,
jinja2/_compat.py,sha256=B6Se8HjnXVpzz9-vfHejn-DV2NjaVK-Iewupc5kKlu8,3191
jinja2/_identifier.py,sha256=EdgGJKi7O1yvr4yFlvqPNEqV6M1qHyQr8Gt8GmVTKVM,1775
jinja2/asyncfilters.py,sha256=XJtYXTxFvcJ5xwk6SaDL4S0oNnT0wPYvXBCSzc482fI,4250
jinja2/asyncsupport.py,sha256=ZBFsDLuq3Gtji3Ia87lcyuDbqaHZJRdtShZcqwpFnSQ,7209
jinja2/bccache.py,sha256=3Pmp4jo65M9FQuIxdxoDBbEDFwe4acDMQf77nEJfrHA,12139
jinja2/compiler.py,sha256=Ta9W1Lit542wItAHXlDcg0sEOsFDMirCdlFPHAurg4o,66284
jinja2/constants.py,sha256=RR1sTzNzUmKco6aZicw4JpQpJGCuPuqm1h1YmCNUEFY,1458
jinja2/debug.py,sha256=neR7GIGGjZH3_ILJGVUYy3eLQCCaWJMXOb7o0kGInWc,8529
jinja2/defaults.py,sha256=85B6YUUCyWPSdrSeVhcqFVuu_bHUAQXeey--FIwSeVQ,1126
jinja2/environment.py,sha256=XDSLKc4SqNLMOwTSq3TbWEyA5WyXfuLuVD0wAVjEFwM,50629
jinja2/exceptions.py,sha256=VjNLawcmf2ODffqVMCQK1cRmvFaUfQWF4u8ouP3QPcE,5425
jinja2/ext.py,sha256=AtwL5O5enT_L3HR9-oBvhGyUTdGoyaqG_ICtnR_EVd4,26441
jinja2/filters.py,sha256=_RpPgAlgIj7ExvyDzcHAC3B36cocfWK-1TEketbNeM0,41415
jinja2/idtracking.py,sha256=J3O4VHsrbf3wzwiBc7Cro26kHb6_5kbULeIOzocchIU,9211
jinja2/lexer.py,sha256=nUFLRKhhKmmEWkLI65nQePgcQs7qsRdjVYZETMt_v0g,30331
jinja2/loaders.py,sha256=C-fST_dmFjgWkp0ZuCkrgICAoOsoSIF28wfAFink0oU,17666
jinja2/meta.py,sha256=QjyYhfNRD3QCXjBJpiPl9KgkEkGXJbAkCUq4-Ur10EQ,4131
jinja2/nativetypes.py,sha256=Ul__gtVw4xH-0qvUvnCNHedQeNDwmEuyLJztzzSPeRg,2753
jinja2/nodes.py,sha256=Mk1oJPVgIjnQw9WOqILvcu3rLepcFZ0ahxQm2mbwDwc,31095
jinja2/optimizer.py,sha256=gQLlMYzvQhluhzmAIFA1tXS0cwgWYOjprN-gTRcHVsc,1457
jinja2/parser.py,sha256=fcfdqePNTNyvosIvczbytVA332qpsURvYnCGcjDHSkA,35660
jinja2/runtime.py,sha256=0y-BRyIEZ9ltByL2Id6GpHe1oDRQAwNeQvI0SKobNMw,30618
jinja2/sandbox.py,sha256=knayyUvXsZ-F0mk15mO2-ehK9gsw04UhB8td-iUOtLc,17127
jinja2/tests.py,sha256=iO_Y-9Vo60zrVe1lMpSl5sKHqAxe2leZHC08OoZ8K24,4799
jinja2/utils.py,sha256=OoVMlQe9S2-lWT6jJbTu9tDuDvGNyWUhHDcE51i5_Do,22522
jinja2/visitor.py,sha256=DUHupl0a4PGp7nxRtZFttUzAi1ccxzqc2hzetPYUz8U,3240
@@ -0,0 +1,6 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.34.2)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

@@ -0,0 +1,3 @@
[babel.extractors]
jinja2 = jinja2.ext:babel_extract [i18n]

@@ -0,0 +1 @@
jinja2
@@ -0,0 +1 @@
pip
@@ -0,0 +1,28 @@
Copyright 2010 Pallets

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its
   contributors may be used to endorse or promote products derived from
   this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
||||
@@ -0,0 +1,105 @@
|
||||
Metadata-Version: 2.1
|
||||
Name: MarkupSafe
|
||||
Version: 1.1.1
|
||||
Summary: Safely add untrusted strings to HTML/XML markup.
|
||||
Home-page: https://palletsprojects.com/p/markupsafe/
|
||||
Author: Armin Ronacher
|
||||
Author-email: armin.ronacher@active-4.com
|
||||
Maintainer: The Pallets Team
|
||||
Maintainer-email: contact@palletsprojects.com
|
||||
License: BSD-3-Clause
|
||||
Project-URL: Documentation, https://markupsafe.palletsprojects.com/
|
||||
Project-URL: Code, https://github.com/pallets/markupsafe
|
||||
Project-URL: Issue tracker, https://github.com/pallets/markupsafe/issues
|
||||
Platform: UNKNOWN
|
||||
Classifier: Development Status :: 5 - Production/Stable
|
||||
Classifier: Environment :: Web Environment
|
||||
Classifier: Intended Audience :: Developers
|
||||
Classifier: License :: OSI Approved :: BSD License
|
||||
Classifier: Operating System :: OS Independent
|
||||
Classifier: Programming Language :: Python
|
||||
Classifier: Programming Language :: Python :: 2
|
||||
Classifier: Programming Language :: Python :: 2.7
|
||||
Classifier: Programming Language :: Python :: 3
|
||||
Classifier: Programming Language :: Python :: 3.4
|
||||
Classifier: Programming Language :: Python :: 3.5
|
||||
Classifier: Programming Language :: Python :: 3.6
|
||||
Classifier: Programming Language :: Python :: 3.7
|
||||
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
|
||||
Classifier: Topic :: Software Development :: Libraries :: Python Modules
|
||||
Classifier: Topic :: Text Processing :: Markup :: HTML
|
||||
Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*
|
||||
Description-Content-Type: text/x-rst
|
||||
|
||||
MarkupSafe
|
||||
==========
|
||||
|
||||
MarkupSafe implements a text object that escapes characters so it is
|
||||
safe to use in HTML and XML. Characters that have special meanings are
|
||||
replaced so that they display as the actual characters. This mitigates
|
||||
injection attacks, meaning untrusted user input can safely be displayed
|
||||
on a page.
|
||||
|
||||
|
||||
Installing
|
||||
----------
|
||||
|
||||
Install and update using `pip`_:
|
||||
|
||||
.. code-block:: text
|
||||
|
||||
pip install -U MarkupSafe
|
||||
|
||||
.. _pip: https://pip.pypa.io/en/stable/quickstart/
|
||||
|
||||
|
||||
Examples
|
||||
--------
|
||||
|
||||
.. code-block:: pycon
|
||||
|
||||
>>> from markupsafe import Markup, escape
|
||||
>>> # escape replaces special characters and wraps in Markup
|
||||
>>> escape('<script>alert(document.cookie);</script>')
|
||||
Markup(u'<script>alert(document.cookie);</script>')
|
||||
>>> # wrap in Markup to mark text "safe" and prevent escaping
|
||||
>>> Markup('<strong>Hello</strong>')
|
||||
Markup('<strong>hello</strong>')
|
||||
>>> escape(Markup('<strong>Hello</strong>'))
|
||||
Markup('<strong>hello</strong>')
|
||||
>>> # Markup is a text subclass (str on Python 3, unicode on Python 2)
|
||||
>>> # methods and operators escape their arguments
|
||||
>>> template = Markup("Hello <em>%s</em>")
|
||||
>>> template % '"World"'
|
||||
Markup('Hello <em>"World"</em>')
|
||||
|
||||
|
||||
Donate
|
||||
------
|
||||
|
||||
The Pallets organization develops and supports MarkupSafe and other
|
||||
libraries that use it. In order to grow the community of contributors
|
||||
and users, and allow the maintainers to devote more time to the
|
||||
projects, `please donate today`_.
|
||||
|
||||
.. _please donate today: https://palletsprojects.com/donate
|
||||
|
||||
|
||||
Links
|
||||
-----
|
||||
|
||||
* Website: https://palletsprojects.com/p/markupsafe/
|
||||
* Documentation: https://markupsafe.palletsprojects.com/
|
||||
* License: `BSD-3-Clause <https://github.com/pallets/markupsafe/blob/master/LICENSE.rst>`_
|
||||
* Releases: https://pypi.org/project/MarkupSafe/
|
||||
* Code: https://github.com/pallets/markupsafe
|
||||
* Issue tracker: https://github.com/pallets/markupsafe/issues
|
||||
* Test status:
|
||||
|
||||
* Linux, Mac: https://travis-ci.org/pallets/markupsafe
|
||||
* Windows: https://ci.appveyor.com/project/pallets/markupsafe
|
||||
|
||||
* Test coverage: https://codecov.io/gh/pallets/markupsafe
|
||||
* Official chat: https://discord.gg/t6rrQZH
|
||||
|
||||
|
||||
@@ -0,0 +1,16 @@
MarkupSafe-1.1.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
MarkupSafe-1.1.1.dist-info/LICENSE.txt,sha256=SJqOEQhQntmKN7uYPhHg9-HTHwvY-Zp5yESOf_N9B-o,1475
MarkupSafe-1.1.1.dist-info/METADATA,sha256=IFCP4hCNGjXJgMoSvdjPiKDLAMUTTWoxKXQsQvmyMNU,3653
MarkupSafe-1.1.1.dist-info/RECORD,,
MarkupSafe-1.1.1.dist-info/WHEEL,sha256=VEyGcIFAmk_1KbI6gaZGw_mMiT-pdGweASQLX-DzYaY,108
MarkupSafe-1.1.1.dist-info/top_level.txt,sha256=qy0Plje5IJuvsCBjejJyhDCjEAdcDLK_2agVcex8Z6U,11
markupsafe/__init__.py,sha256=oTblO5f9KFM-pvnq9bB0HgElnqkJyqHnFN1Nx2NIvnY,10126
markupsafe/__pycache__/__init__.cpython-38.pyc,,
markupsafe/__pycache__/_compat.cpython-38.pyc,,
markupsafe/__pycache__/_constants.cpython-38.pyc,,
markupsafe/__pycache__/_native.cpython-38.pyc,,
markupsafe/_compat.py,sha256=uEW1ybxEjfxIiuTbRRaJpHsPFf4yQUMMKaPgYEC5XbU,558
markupsafe/_constants.py,sha256=zo2ajfScG-l1Sb_52EP3MlDCqO7Y1BVHUXXKRsVDRNk,4690
markupsafe/_native.py,sha256=d-8S_zzYt2y512xYcuSxq0NeG2DUUvG80wVdTn-4KI8,1873
markupsafe/_speedups.c,sha256=k0fzEIK3CP6MmMqeY0ob43TP90mVN0DTyn7BAl3RqSg,9884
markupsafe/_speedups.cpython-38-x86_64-linux-gnu.so,sha256=SbJwN321Xn7OPYGv5a6Ghzga75uT8RHQUGkoQUASF-o,48016
@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.31.1)
Root-Is-Purelib: false
Tag: cp38-cp38-manylinux1_x86_64

@@ -0,0 +1 @@
markupsafe
@@ -0,0 +1,39 @@
Metadata-Version: 1.2
Name: PyYAML
Version: 5.3.1
Summary: YAML parser and emitter for Python
Home-page: https://github.com/yaml/pyyaml
Author: Kirill Simonov
Author-email: xi@resolvent.net
License: MIT
Download-URL: https://pypi.org/project/PyYAML/
Description: YAML is a data serialization format designed for human readability
        and interaction with scripting languages. PyYAML is a YAML parser
        and emitter for Python.

        PyYAML features a complete YAML 1.1 parser, Unicode support, pickle
        support, capable extension API, and sensible error messages. PyYAML
        supports standard YAML tags and provides Python-specific tags that
        allow to represent an arbitrary Python object.

        PyYAML is applicable for a broad range of tasks from complex
        configuration files to object serialization and persistence.
Platform: Any
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Cython
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Text Processing :: Markup
Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*
@@ -0,0 +1,29 @@
LICENSE
README
setup.cfg
setup.py
ext/_yaml.c
ext/_yaml.h
ext/_yaml.pxd
ext/_yaml.pyx
lib3/PyYAML.egg-info/PKG-INFO
lib3/PyYAML.egg-info/SOURCES.txt
lib3/PyYAML.egg-info/dependency_links.txt
lib3/PyYAML.egg-info/top_level.txt
lib3/yaml/__init__.py
lib3/yaml/composer.py
lib3/yaml/constructor.py
lib3/yaml/cyaml.py
lib3/yaml/dumper.py
lib3/yaml/emitter.py
lib3/yaml/error.py
lib3/yaml/events.py
lib3/yaml/loader.py
lib3/yaml/nodes.py
lib3/yaml/parser.py
lib3/yaml/reader.py
lib3/yaml/representer.py
lib3/yaml/resolver.py
lib3/yaml/scanner.py
lib3/yaml/serializer.py
lib3/yaml/tokens.py
@@ -0,0 +1 @@

@@ -0,0 +1,38 @@
../yaml/__init__.py
../yaml/__pycache__/__init__.cpython-38.pyc
../yaml/__pycache__/composer.cpython-38.pyc
../yaml/__pycache__/constructor.cpython-38.pyc
../yaml/__pycache__/cyaml.cpython-38.pyc
../yaml/__pycache__/dumper.cpython-38.pyc
../yaml/__pycache__/emitter.cpython-38.pyc
../yaml/__pycache__/error.cpython-38.pyc
../yaml/__pycache__/events.cpython-38.pyc
../yaml/__pycache__/loader.cpython-38.pyc
../yaml/__pycache__/nodes.cpython-38.pyc
../yaml/__pycache__/parser.cpython-38.pyc
../yaml/__pycache__/reader.cpython-38.pyc
../yaml/__pycache__/representer.cpython-38.pyc
../yaml/__pycache__/resolver.cpython-38.pyc
../yaml/__pycache__/scanner.cpython-38.pyc
../yaml/__pycache__/serializer.cpython-38.pyc
../yaml/__pycache__/tokens.cpython-38.pyc
../yaml/composer.py
../yaml/constructor.py
../yaml/cyaml.py
../yaml/dumper.py
../yaml/emitter.py
../yaml/error.py
../yaml/events.py
../yaml/loader.py
../yaml/nodes.py
../yaml/parser.py
../yaml/reader.py
../yaml/representer.py
../yaml/resolver.py
../yaml/scanner.py
../yaml/serializer.py
../yaml/tokens.py
PKG-INFO
SOURCES.txt
dependency_links.txt
top_level.txt
@@ -0,0 +1,2 @@
|
||||
_yaml
|
||||
yaml
|
||||
@@ -0,0 +1 @@
|
||||
pip
|
||||
@@ -0,0 +1,28 @@
|
||||
Copyright 2007 Pallets
|
||||
|
||||
Redistribution and use in source and binary forms, with or without
|
||||
modification, are permitted provided that the following conditions are
|
||||
met:
|
||||
|
||||
1. Redistributions of source code must retain the above copyright
|
||||
notice, this list of conditions and the following disclaimer.
|
||||
|
||||
2. Redistributions in binary form must reproduce the above copyright
|
||||
notice, this list of conditions and the following disclaimer in the
|
||||
documentation and/or other materials provided with the distribution.
|
||||
|
||||
3. Neither the name of the copyright holder nor the names of its
|
||||
contributors may be used to endorse or promote products derived from
|
||||
this software without specific prior written permission.
|
||||
|
||||
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
|
||||
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
|
||||
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
|
||||
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
|
||||
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
|
||||
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
|
||||
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
|
||||
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
|
||||
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
|
||||
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
|
||||
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||
@@ -0,0 +1,128 @@
|
||||
Metadata-Version: 2.1
|
||||
Name: Werkzeug
|
||||
Version: 1.0.1
|
||||
Summary: The comprehensive WSGI web application library.
|
||||
Home-page: https://palletsprojects.com/p/werkzeug/
|
||||
Author: Armin Ronacher
|
||||
Author-email: armin.ronacher@active-4.com
|
||||
Maintainer: Pallets
|
||||
Maintainer-email: contact@palletsprojects.com
|
||||
License: BSD-3-Clause
|
||||
Project-URL: Documentation, https://werkzeug.palletsprojects.com/
|
||||
Project-URL: Code, https://github.com/pallets/werkzeug
|
||||
Project-URL: Issue tracker, https://github.com/pallets/werkzeug/issues
|
||||
Platform: UNKNOWN
|
||||
Classifier: Development Status :: 5 - Production/Stable
|
||||
Classifier: Environment :: Web Environment
|
||||
Classifier: Intended Audience :: Developers
|
||||
Classifier: License :: OSI Approved :: BSD License
|
||||
Classifier: Operating System :: OS Independent
|
||||
Classifier: Programming Language :: Python
|
||||
Classifier: Programming Language :: Python :: 2
|
||||
Classifier: Programming Language :: Python :: 2.7
|
||||
Classifier: Programming Language :: Python :: 3
|
||||
Classifier: Programming Language :: Python :: 3.5
|
||||
Classifier: Programming Language :: Python :: 3.6
|
||||
Classifier: Programming Language :: Python :: 3.7
|
||||
Classifier: Programming Language :: Python :: 3.8
|
||||
Classifier: Programming Language :: Python :: Implementation :: CPython
|
||||
Classifier: Programming Language :: Python :: Implementation :: PyPy
|
||||
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
|
||||
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI
|
||||
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI :: Application
|
||||
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI :: Middleware
|
||||
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
|
||||
Classifier: Topic :: Software Development :: Libraries :: Python Modules
|
||||
Requires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*
|
||||
Description-Content-Type: text/x-rst
|
||||
Provides-Extra: dev
|
||||
Requires-Dist: pytest ; extra == 'dev'
|
||||
Requires-Dist: pytest-timeout ; extra == 'dev'
|
||||
Requires-Dist: coverage ; extra == 'dev'
|
||||
Requires-Dist: tox ; extra == 'dev'
|
||||
Requires-Dist: sphinx ; extra == 'dev'
|
||||
Requires-Dist: pallets-sphinx-themes ; extra == 'dev'
|
||||
Requires-Dist: sphinx-issues ; extra == 'dev'
|
||||
Provides-Extra: watchdog
|
||||
Requires-Dist: watchdog ; extra == 'watchdog'
|
||||
|
||||
Werkzeug
|
||||
========
|
||||
|
||||
*werkzeug* German noun: "tool". Etymology: *werk* ("work"), *zeug* ("stuff")
|
||||
|
||||
Werkzeug is a comprehensive `WSGI`_ web application library. It began as
|
||||
a simple collection of various utilities for WSGI applications and has
|
||||
become one of the most advanced WSGI utility libraries.
|
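The raw WSGI interface that Werkzeug builds on can be shown with the standard library alone. A minimal sketch using ``wsgiref`` (not Werkzeug itself) — this is the low-level callable that Werkzeug's wrappers abstract away:

```python
# A bare WSGI application using only the standard library.
from wsgiref.util import setup_testing_defaults

def application(environ, start_response):
    body = b"Hello, World!"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Call the app directly with a synthetic environ; no server required.
environ = {}
setup_testing_defaults(environ)
collected = {}

def start_response(status, headers):
    # Record what the app reported, like a test client would.
    collected["status"] = status
    collected["headers"] = headers

result = b"".join(application(environ, start_response))
```

Calling the app by hand like this is essentially what a WSGI test client automates.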
||||
|
||||
It includes:
|
||||
|
||||
- An interactive debugger that allows inspecting stack traces and
|
||||
source code in the browser with an interactive interpreter for any
|
||||
frame in the stack.
|
||||
- A full-featured request object with objects to interact with
|
||||
headers, query args, form data, files, and cookies.
|
||||
- A response object that can wrap other WSGI applications and handle
|
||||
streaming data.
|
||||
- A routing system for matching URLs to endpoints and generating URLs
|
||||
for endpoints, with an extensible system for capturing variables
|
||||
from URLs.
|
||||
- HTTP utilities to handle entity tags, cache control, dates, user
|
||||
agents, cookies, files, and more.
|
||||
- A threaded WSGI server for use while developing applications
|
||||
locally.
|
||||
- A test client for simulating HTTP requests during testing without
|
||||
requiring running a server.
|
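The URL-matching idea in the routing bullet above can be sketched in plain Python. This is a hypothetical minimal matcher for illustration only; Werkzeug's actual ``Map``/``Rule`` API is far richer:

```python
import re

# Toy URL router: compile rules like "/user/<name>" into regexes and
# match a path to an (endpoint, variables) pair -- a sketch of the
# rule-matching idea, not Werkzeug's real routing implementation.
def compile_rule(rule, endpoint):
    pattern = re.sub(r"<(\w+)>", r"(?P<\1>[^/]+)", rule)
    return re.compile(f"^{pattern}$"), endpoint

rules = [compile_rule("/", "index"), compile_rule("/user/<name>", "user")]

def match(path):
    for regex, endpoint in rules:
        m = regex.match(path)
        if m:
            return endpoint, m.groupdict()
    raise LookupError(path)
```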
||||
|
||||
Werkzeug is Unicode aware and doesn't enforce any dependencies. It is up
|
||||
to the developer to choose a template engine, database adapter, and even
|
||||
how to handle requests. It can be used to build all sorts of end user
|
||||
applications such as blogs, wikis, or bulletin boards.
|
||||
|
||||
`Flask`_ wraps Werkzeug, using it to handle the details of WSGI while
|
||||
providing more structure and patterns for defining powerful
|
||||
applications.
|
||||
|
||||
|
||||
Installing
|
||||
----------
|
||||
|
||||
Install and update using `pip`_:
|
||||
|
||||
.. code-block:: text
|
||||
|
||||
pip install -U Werkzeug
|
||||
|
||||
|
||||
A Simple Example
|
||||
----------------
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
from werkzeug.wrappers import Request, Response
|
||||
|
||||
@Request.application
|
||||
def application(request):
|
||||
return Response('Hello, World!')
|
||||
|
||||
if __name__ == '__main__':
|
||||
from werkzeug.serving import run_simple
|
||||
run_simple('localhost', 4000, application)
|
||||
|
||||
|
||||
Links
|
||||
-----
|
||||
|
||||
- Website: https://palletsprojects.com/p/werkzeug/
|
||||
- Documentation: https://werkzeug.palletsprojects.com/
|
||||
- Releases: https://pypi.org/project/Werkzeug/
|
||||
- Code: https://github.com/pallets/werkzeug
|
||||
- Issue tracker: https://github.com/pallets/werkzeug/issues
|
||||
- Test status: https://dev.azure.com/pallets/werkzeug/_build
|
||||
- Official chat: https://discord.gg/t6rrQZH
|
||||
|
||||
.. _WSGI: https://wsgi.readthedocs.io/en/latest/
|
||||
.. _Flask: https://www.palletsprojects.com/p/flask/
|
||||
.. _pip: https://pip.pypa.io/en/stable/quickstart/
|
||||
|
||||
|
||||
@@ -0,0 +1,101 @@
|
||||
Werkzeug-1.0.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
|
||||
Werkzeug-1.0.1.dist-info/LICENSE.rst,sha256=O0nc7kEF6ze6wQ-vG-JgQI_oXSUrjp3y4JefweCUQ3s,1475
|
||||
Werkzeug-1.0.1.dist-info/METADATA,sha256=d0zmVNa4UC2-nAo2A8_81oiy123D6JTGRSuY_Ymgyt4,4730
|
||||
Werkzeug-1.0.1.dist-info/RECORD,,
|
||||
Werkzeug-1.0.1.dist-info/WHEEL,sha256=kGT74LWyRUZrL4VgLh6_g12IeVl_9u9ZVhadrgXZUEY,110
|
||||
Werkzeug-1.0.1.dist-info/top_level.txt,sha256=QRyj2VjwJoQkrwjwFIOlB8Xg3r9un0NtqVHQF-15xaw,9
|
||||
werkzeug/__init__.py,sha256=rb-yPiXOjTLbtDOl5fQp5hN7oBdaoXAoQ-slAAvfZAo,502
|
||||
werkzeug/__pycache__/__init__.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/_compat.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/_internal.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/_reloader.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/datastructures.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/exceptions.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/filesystem.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/formparser.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/http.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/local.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/posixemulation.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/routing.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/security.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/serving.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/test.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/testapp.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/urls.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/useragents.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/utils.cpython-38.pyc,,
|
||||
werkzeug/__pycache__/wsgi.cpython-38.pyc,,
|
||||
werkzeug/_compat.py,sha256=zjufTNrhQ8BgYSGSh-sVu6iW3r3O9WzjE9j-qJobx-g,6671
|
||||
werkzeug/_internal.py,sha256=d_4AqheyS6dHMViwdc0drFrjs67ZzT6Ej2gWf-Z-Iys,14351
|
||||
werkzeug/_reloader.py,sha256=I3mg3oRQ0lLzl06oEoVopN3bN7CtINuuUQdqDcmTnEs,11531
|
||||
werkzeug/datastructures.py,sha256=AonxOcwU0TPMEzfKF1368ySULxHgxE-JE-DEAGdo2ts,100480
|
||||
werkzeug/debug/__init__.py,sha256=3RtUMc5Y9hYyK11ugHltgkQ9Dt-ViR945Vy_X5NV7zU,17289
|
||||
werkzeug/debug/__pycache__/__init__.cpython-38.pyc,,
|
||||
werkzeug/debug/__pycache__/console.cpython-38.pyc,,
|
||||
werkzeug/debug/__pycache__/repr.cpython-38.pyc,,
|
||||
werkzeug/debug/__pycache__/tbtools.cpython-38.pyc,,
|
||||
werkzeug/debug/console.py,sha256=OATaO7KHYMqpbzIFe1HeW9Mnl3wZgA3jMQoGDPn5URc,5488
|
||||
werkzeug/debug/repr.py,sha256=lIwuhbyrMwVe3P_cFqNyqzHL7P93TLKod7lw9clydEw,9621
|
||||
werkzeug/debug/shared/FONT_LICENSE,sha256=LwAVEI1oYnvXiNMT9SnCH_TaLCxCpeHziDrMg0gPkAI,4673
|
||||
werkzeug/debug/shared/console.png,sha256=bxax6RXXlvOij_KeqvSNX0ojJf83YbnZ7my-3Gx9w2A,507
|
||||
werkzeug/debug/shared/debugger.js,sha256=rOhqZMRfpZnnu6_XCGn6wMWPhtfwRAcyZKksdIxPJas,6400
|
||||
werkzeug/debug/shared/jquery.js,sha256=CSXorXvZcTkaix6Yvo6HppcZGetbYMGWSFlBw8HfCJo,88145
|
||||
werkzeug/debug/shared/less.png,sha256=-4-kNRaXJSONVLahrQKUxMwXGm9R4OnZ9SxDGpHlIR4,191
|
||||
werkzeug/debug/shared/more.png,sha256=GngN7CioHQoV58rH6ojnkYi8c_qED2Aka5FO5UXrReY,200
|
||||
werkzeug/debug/shared/source.png,sha256=RoGcBTE4CyCB85GBuDGTFlAnUqxwTBiIfDqW15EpnUQ,818
|
||||
werkzeug/debug/shared/style.css,sha256=gZ9uhmb5zj3XLuT9RvnMp6jMINgQ-VVBCp-2AZbG3YQ,6604
|
||||
werkzeug/debug/shared/ubuntu.ttf,sha256=1eaHFyepmy4FyDvjLVzpITrGEBu_CZYY94jE0nED1c0,70220
|
||||
werkzeug/debug/tbtools.py,sha256=2iJ8RURUZUSbopOIehy53LnVJWx47lsHN2V2l6hc7Wc,20363
|
||||
werkzeug/exceptions.py,sha256=UTYSDkmAsH-vt8VSidlEffwqBVNXuT7bRg-_NqgUe8A,25188
|
||||
werkzeug/filesystem.py,sha256=HzKl-j0Hd8Jl66j778UbPTAYNnY6vUZgYLlBZ0e7uw0,2101
|
||||
werkzeug/formparser.py,sha256=Sto0jZid9im9ZVIf56vilCdyX-arK33wSftkYsLCnzo,21788
|
||||
werkzeug/http.py,sha256=KVRV3yFK14PJeI56qClEq4qxFdvKUQVy4C_dwuWz9_Q,43107
|
||||
werkzeug/local.py,sha256=_Tk7gB238pPWUU7habxFkZF02fiCMRVW6d62YWL1Rh0,14371
|
||||
werkzeug/middleware/__init__.py,sha256=f1SFZo67IlW4k1uqKzNHxYQlsakUS-D6KK_j0e3jjwQ,549
|
||||
werkzeug/middleware/__pycache__/__init__.cpython-38.pyc,,
|
||||
werkzeug/middleware/__pycache__/dispatcher.cpython-38.pyc,,
|
||||
werkzeug/middleware/__pycache__/http_proxy.cpython-38.pyc,,
|
||||
werkzeug/middleware/__pycache__/lint.cpython-38.pyc,,
|
||||
werkzeug/middleware/__pycache__/profiler.cpython-38.pyc,,
|
||||
werkzeug/middleware/__pycache__/proxy_fix.cpython-38.pyc,,
|
||||
werkzeug/middleware/__pycache__/shared_data.cpython-38.pyc,,
|
||||
werkzeug/middleware/dispatcher.py,sha256=_-KoMzHtcISHS7ouWKAOraqlCLprdh83YOAn_8DjLp8,2240
|
||||
werkzeug/middleware/http_proxy.py,sha256=lRjTdMmghHiZuZrS7_UJ3gZc-vlFizhBbFZ-XZPLwIA,7117
|
||||
werkzeug/middleware/lint.py,sha256=ItTwuWJnflF8xMT1uqU_Ty1ryhux-CjeUfskqaUpxsw,12967
|
||||
werkzeug/middleware/profiler.py,sha256=8B_s23d6BGrU_q54gJsm6kcCbOJbTSqrXCsioHON0Xs,4471
|
||||
werkzeug/middleware/proxy_fix.py,sha256=K5oZ3DPXOzdZi0Xba5zW7ClPOxgUuqXHQHvY2-AWCGw,6431
|
||||
werkzeug/middleware/shared_data.py,sha256=sPSRTKqtKSVBUyN8fr6jOJbdq9cdOLu6pg3gz4Y_1Xo,9599
|
||||
werkzeug/posixemulation.py,sha256=gSSiv1SCmOyzOM_nq1ZaZCtxP__C5MeDJl_4yXJmi4Q,3541
|
||||
werkzeug/routing.py,sha256=6-iZ7CKeUILYAehoKXLbmi5E6LgLbwuzUh8TNplnf5Q,79019
|
||||
werkzeug/security.py,sha256=81149MplFq7-hD4RK4sKp9kzXXejjV9D4lWBzaRyeQ8,8106
|
||||
werkzeug/serving.py,sha256=YvTqvurA-Mnj8mkqRe2kBdVr2ap4ibCq1ByQjOA6g1w,38694
|
||||
werkzeug/test.py,sha256=GJ9kxTMSJ-nB7kfGtxuROr9JGmXxDRev-2U1SkeUJGE,39564
|
||||
werkzeug/testapp.py,sha256=bHekqMsqRfVxwgFbvOMem-DYa_sdB7R47yUXpt1RUTo,9329
|
||||
werkzeug/urls.py,sha256=T8-hV_1vwhu6xhX93FwsHteK-W-kIE2orj5WoMf-WFw,39322
|
||||
werkzeug/useragents.py,sha256=TSoGv5IOvP375eK5gLLpsLQCeUgTR6sO1WftmAP_YvM,5563
|
||||
werkzeug/utils.py,sha256=hrVK4u_wi8z9viBO9bgOLlm1aaIvCpn-p2d1FeZQDEo,25251
|
||||
werkzeug/wrappers/__init__.py,sha256=S4VioKAmF_av9Ec9zQvG71X1EOkYfPx1TYck9jyDiyY,1384
|
||||
werkzeug/wrappers/__pycache__/__init__.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/accept.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/auth.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/base_request.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/base_response.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/common_descriptors.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/cors.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/etag.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/json.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/request.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/response.cpython-38.pyc,,
|
||||
werkzeug/wrappers/__pycache__/user_agent.cpython-38.pyc,,
|
||||
werkzeug/wrappers/accept.py,sha256=TIvjUc0g73fhTWX54wg_D9NNzKvpnG1X8u1w26tK1o8,1760
|
||||
werkzeug/wrappers/auth.py,sha256=Pmn6iaGHBrUyHbJpW0lZhO_q9RVoAa5QalaTqcavdAI,1158
|
||||
werkzeug/wrappers/base_request.py,sha256=4TuGlKWeKQdlq4eU94hJYcXSfWo8Rk7CS1Ef5lJ3ZM0,26012
|
||||
werkzeug/wrappers/base_response.py,sha256=JTxJZ8o-IBetpoWJqt2HFwPaNWNDAlM3_GXJe1Whw80,27784
|
||||
werkzeug/wrappers/common_descriptors.py,sha256=X2Ktd5zUWsmcd4ciaF62Dd8Lru9pLGP_XDUNukc8cXs,12829
|
||||
werkzeug/wrappers/cors.py,sha256=XMbaCol4dWTGvb-dCJBoN0p3JX91v93AIAHd7tnB3L4,3466
|
||||
werkzeug/wrappers/etag.py,sha256=XMXtyfByBsOjxwaX8U7ZtUY7JXkbQLP45oXZ0qkyTNs,12217
|
||||
werkzeug/wrappers/json.py,sha256=HvK_A4NpO0sLqgb10sTJcoZydYOwyNiPCJPV7SVgcgE,4343
|
||||
werkzeug/wrappers/request.py,sha256=QbHGqDpGPN684pnOPEokwkPESfm-NnfYM7ydOMxW_NI,1514
|
||||
werkzeug/wrappers/response.py,sha256=Oqv8TMG_dnOKTq_V30ddgkO5B7IJhkVPODvm7cbhZ3c,2524
|
||||
werkzeug/wrappers/user_agent.py,sha256=YJb-vr12cujG7sQMG9V89VsJa-03SWSenhg1W4cT0EY,435
|
||||
werkzeug/wsgi.py,sha256=ZGk85NzRyQTzkYis-xl8V9ydJgfClBdStvhzDzER2mw,34367
|
||||
@@ -0,0 +1,6 @@
|
||||
Wheel-Version: 1.0
|
||||
Generator: bdist_wheel (0.34.2)
|
||||
Root-Is-Purelib: true
|
||||
Tag: py2-none-any
|
||||
Tag: py3-none-any
|
||||
|
||||
@@ -0,0 +1 @@
|
||||
werkzeug
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,138 @@
|
||||
aiofiles: file support for asyncio
|
||||
==================================
|
||||
|
||||
.. image:: https://img.shields.io/pypi/v/aiofiles.svg
|
||||
:target: https://pypi.python.org/pypi/aiofiles
|
||||
|
||||
.. image:: https://travis-ci.org/Tinche/aiofiles.svg?branch=master
|
||||
:target: https://travis-ci.org/Tinche/aiofiles
|
||||
|
||||
.. image:: https://codecov.io/gh/Tinche/aiofiles/branch/master/graph/badge.svg
|
||||
:target: https://codecov.io/gh/Tinche/aiofiles
|
||||
|
||||
**aiofiles** is an Apache2 licensed library, written in Python, for handling local
|
||||
disk files in asyncio applications.
|
||||
|
||||
Ordinary local file IO is blocking, and cannot easily and portably be made
|
||||
asynchronous. This means doing file IO may interfere with asyncio applications,
|
||||
which shouldn't block the executing thread. aiofiles helps with this by
|
||||
introducing asynchronous versions of files that support delegating operations to
|
||||
a separate thread pool.
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
async with aiofiles.open('filename', mode='r') as f:
|
||||
contents = await f.read()
|
||||
print(contents)
|
||||
'My file contents'
|
||||
|
||||
Asynchronous iteration is also supported.
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
async with aiofiles.open('filename') as f:
|
||||
async for line in f:
|
||||
...
|
||||
|
||||
Features
|
||||
--------
|
||||
|
||||
- a file API very similar to Python's standard, blocking API
|
||||
- support for buffered and unbuffered binary files, and buffered text files
|
||||
- support for ``async``/``await`` (:PEP:`492`) constructs
|
||||
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
To install aiofiles, simply:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
$ pip install aiofiles
|
||||
|
||||
Usage
|
||||
-----
|
||||
|
||||
Files are opened using the ``aiofiles.open()`` coroutine, which in addition to
|
||||
mirroring the builtin ``open`` accepts optional ``loop`` and ``executor``
|
||||
arguments. If ``loop`` is absent, the default loop will be used, as per the
|
||||
set asyncio policy. If ``executor`` is not specified, the default event loop
|
||||
executor will be used.
|
||||
|
||||
In case of success, an asynchronous file object is returned with an
|
||||
API identical to an ordinary file, except the following methods are coroutines
|
||||
and delegate to an executor:
|
||||
|
||||
* ``close``
|
||||
* ``flush``
|
||||
* ``isatty``
|
||||
* ``read``
|
||||
* ``readall``
|
||||
* ``read1``
|
||||
* ``readinto``
|
||||
* ``readline``
|
||||
* ``readlines``
|
||||
* ``seek``
|
||||
* ``seekable``
|
||||
* ``tell``
|
||||
* ``truncate``
|
||||
* ``writable``
|
||||
* ``write``
|
||||
* ``writelines``
|
||||
|
||||
In case of failure, one of the usual exceptions will be raised.
|
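The delegation described above — blocking calls handed off to a thread pool so the event loop stays free — can be sketched with the standard library alone (a toy illustration of the pattern, not aiofiles' implementation):

```python
import asyncio
import os
import tempfile

# Push a blocking file read onto the loop's default executor so the
# coroutine never blocks the event loop -- the core aiofiles idea.
async def read_file(path):
    loop = asyncio.get_running_loop()
    with open(path) as f:
        return await loop.run_in_executor(None, f.read)

async def main():
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "w") as f:
        f.write("hello")
    try:
        return await read_file(path)
    finally:
        os.remove(path)

result = asyncio.run(main())
```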
||||
|
||||
The ``aiofiles.os`` module contains executor-enabled coroutine versions of
|
||||
several useful ``os`` functions that deal with files:
|
||||
|
||||
* ``stat``
|
||||
* ``sendfile``
|
||||
|
||||
Writing tests for aiofiles
|
||||
~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||
|
||||
Real file IO can be mocked by patching ``aiofiles.threadpool.sync_open``
|
||||
as desired. The return type also needs to be registered with the
|
||||
``aiofiles.threadpool.wrap`` dispatcher:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
aiofiles.threadpool.wrap.register(mock.MagicMock)(
|
||||
lambda *args, **kwargs: threadpool.AsyncBufferedIOBase(*args, **kwargs))
|
||||
|
||||
async def test_stuff():
|
||||
data = 'data'
|
||||
mock_file = mock.MagicMock()
|
||||
|
||||
with mock.patch('aiofiles.threadpool.sync_open', return_value=mock_file) as mock_open:
|
||||
async with aiofiles.open('filename', 'w') as f:
|
||||
await f.write(data)
|
||||
|
||||
mock_file.write.assert_called_once_with(data)
|
||||
|
||||
History
|
||||
~~~~~~~
|
||||
|
||||
0.4.0 (2018-08-11)
|
||||
``````````````````
|
||||
- Python 3.7 support.
|
||||
- Removed Python 3.3/3.4 support. If you use these versions, stick to aiofiles 0.3.x.
|
||||
|
||||
0.3.2 (2017-09-23)
|
||||
``````````````````
|
||||
- The LICENSE is now included in the sdist.
|
||||
`#31 <https://github.com/Tinche/aiofiles/pull/31>`_
|
||||
|
||||
0.3.1 (2017-03-10)
|
||||
``````````````````
|
||||
|
||||
- Introduced a changelog.
|
||||
- ``aiofiles.os.sendfile`` will now work if the standard ``os`` module contains a ``sendfile`` function.
|
||||
|
||||
Contributing
|
||||
~~~~~~~~~~~~
|
||||
Contributions are very welcome. Tests can be run with ``tox``; please ensure
|
||||
the coverage at least stays the same before you submit a pull request.
|
||||
|
||||
|
||||
@@ -0,0 +1 @@
|
||||
pip
|
||||
@@ -0,0 +1,155 @@
|
||||
Metadata-Version: 2.0
|
||||
Name: aiofiles
|
||||
Version: 0.4.0
|
||||
Summary: File support for asyncio.
|
||||
Home-page: https://github.com/Tinche/aiofiles
|
||||
Author: Tin Tvrtkovic
|
||||
Author-email: tinchester@gmail.com
|
||||
License: Apache 2.0
|
||||
Platform: UNKNOWN
|
||||
Classifier: Development Status :: 4 - Beta
|
||||
Classifier: Intended Audience :: Developers
|
||||
Classifier: License :: OSI Approved :: Apache Software License
|
||||
Classifier: Programming Language :: Python :: 3.5
|
||||
Classifier: Programming Language :: Python :: 3.6
|
||||
Classifier: Programming Language :: Python :: 3.7
|
||||
Classifier: Topic :: System :: Filesystems
|
||||
|
||||
aiofiles: file support for asyncio
|
||||
==================================
|
||||
|
||||
.. image:: https://img.shields.io/pypi/v/aiofiles.svg
|
||||
:target: https://pypi.python.org/pypi/aiofiles
|
||||
|
||||
.. image:: https://travis-ci.org/Tinche/aiofiles.svg?branch=master
|
||||
:target: https://travis-ci.org/Tinche/aiofiles
|
||||
|
||||
.. image:: https://codecov.io/gh/Tinche/aiofiles/branch/master/graph/badge.svg
|
||||
:target: https://codecov.io/gh/Tinche/aiofiles
|
||||
|
||||
**aiofiles** is an Apache2 licensed library, written in Python, for handling local
|
||||
disk files in asyncio applications.
|
||||
|
||||
Ordinary local file IO is blocking, and cannot easily and portably be made
|
||||
asynchronous. This means doing file IO may interfere with asyncio applications,
|
||||
which shouldn't block the executing thread. aiofiles helps with this by
|
||||
introducing asynchronous versions of files that support delegating operations to
|
||||
a separate thread pool.
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
async with aiofiles.open('filename', mode='r') as f:
|
||||
contents = await f.read()
|
||||
print(contents)
|
||||
'My file contents'
|
||||
|
||||
Asynchronous iteration is also supported.
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
async with aiofiles.open('filename') as f:
|
||||
async for line in f:
|
||||
...
|
||||
|
||||
Features
|
||||
--------
|
||||
|
||||
- a file API very similar to Python's standard, blocking API
|
||||
- support for buffered and unbuffered binary files, and buffered text files
|
||||
- support for ``async``/``await`` (:PEP:`492`) constructs
|
||||
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
To install aiofiles, simply:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
$ pip install aiofiles
|
||||
|
||||
Usage
|
||||
-----
|
||||
|
||||
Files are opened using the ``aiofiles.open()`` coroutine, which in addition to
|
||||
mirroring the builtin ``open`` accepts optional ``loop`` and ``executor``
|
||||
arguments. If ``loop`` is absent, the default loop will be used, as per the
|
||||
set asyncio policy. If ``executor`` is not specified, the default event loop
|
||||
executor will be used.
|
||||
|
||||
In case of success, an asynchronous file object is returned with an
|
||||
API identical to an ordinary file, except the following methods are coroutines
|
||||
and delegate to an executor:
|
||||
|
||||
* ``close``
|
||||
* ``flush``
|
||||
* ``isatty``
|
||||
* ``read``
|
||||
* ``readall``
|
||||
* ``read1``
|
||||
* ``readinto``
|
||||
* ``readline``
|
||||
* ``readlines``
|
||||
* ``seek``
|
||||
* ``seekable``
|
||||
* ``tell``
|
||||
* ``truncate``
|
||||
* ``writable``
|
||||
* ``write``
|
||||
* ``writelines``
|
||||
|
||||
In case of failure, one of the usual exceptions will be raised.
|
||||
|
||||
The ``aiofiles.os`` module contains executor-enabled coroutine versions of
|
||||
several useful ``os`` functions that deal with files:
|
||||
|
||||
* ``stat``
|
||||
* ``sendfile``
|
||||
|
||||
Writing tests for aiofiles
|
||||
~~~~~~~~~~~~~~~~~~~~~~~~~~
|
||||
|
||||
Real file IO can be mocked by patching ``aiofiles.threadpool.sync_open``
|
||||
as desired. The return type also needs to be registered with the
|
||||
``aiofiles.threadpool.wrap`` dispatcher:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
aiofiles.threadpool.wrap.register(mock.MagicMock)(
|
||||
lambda *args, **kwargs: threadpool.AsyncBufferedIOBase(*args, **kwargs))
|
||||
|
||||
async def test_stuff():
|
||||
data = 'data'
|
||||
mock_file = mock.MagicMock()
|
||||
|
||||
with mock.patch('aiofiles.threadpool.sync_open', return_value=mock_file) as mock_open:
|
||||
async with aiofiles.open('filename', 'w') as f:
|
||||
await f.write(data)
|
||||
|
||||
mock_file.write.assert_called_once_with(data)
|
||||
|
||||
History
|
||||
~~~~~~~
|
||||
|
||||
0.4.0 (2018-08-11)
|
||||
``````````````````
|
||||
- Python 3.7 support.
|
||||
- Removed Python 3.3/3.4 support. If you use these versions, stick to aiofiles 0.3.x.
|
||||
|
||||
0.3.2 (2017-09-23)
|
||||
``````````````````
|
||||
- The LICENSE is now included in the sdist.
|
||||
`#31 <https://github.com/Tinche/aiofiles/pull/31>`_
|
||||
|
||||
0.3.1 (2017-03-10)
|
||||
``````````````````
|
||||
|
||||
- Introduced a changelog.
|
||||
- ``aiofiles.os.sendfile`` will now work if the standard ``os`` module contains a ``sendfile`` function.
|
||||
|
||||
Contributing
|
||||
~~~~~~~~~~~~
|
||||
Contributions are very welcome. Tests can be run with ``tox``; please ensure
|
||||
the coverage at least stays the same before you submit a pull request.
|
||||
|
||||
|
||||
@@ -0,0 +1,23 @@
|
||||
aiofiles-0.4.0.dist-info/DESCRIPTION.rst,sha256=H1Vj_rqfRMCdJcU1lW_sPSIWr185N97Kn594gYK_B1A,3873
|
||||
aiofiles-0.4.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
|
||||
aiofiles-0.4.0.dist-info/METADATA,sha256=RqoqpyBnW1j0u1t3w9jQ_F-0vZWsepXvs0MA-GFoMsM,4445
|
||||
aiofiles-0.4.0.dist-info/RECORD,,
|
||||
aiofiles-0.4.0.dist-info/WHEEL,sha256=8Lm45v9gcYRm70DrgFGVe4WsUtUMi1_0Tso1hqPGMjA,92
|
||||
aiofiles-0.4.0.dist-info/metadata.json,sha256=0eE-SIzvVCI2-nBAir1RKYLcPpxWa1uzpmAyD9Vye9M,712
|
||||
aiofiles-0.4.0.dist-info/top_level.txt,sha256=sskrEAT1Ocyj9qsJIoeIQNAijBFwY2L0nqayXghOSI0,9
|
||||
aiofiles/__init__.py,sha256=EqOSYq0u50pdIbUzDXqyeImxnHUmT9AtbRi_PtO5Hyw,123
|
||||
aiofiles/__pycache__/__init__.cpython-38.pyc,,
|
||||
aiofiles/__pycache__/_compat.cpython-38.pyc,,
|
||||
aiofiles/__pycache__/base.cpython-38.pyc,,
|
||||
aiofiles/__pycache__/os.cpython-38.pyc,,
|
||||
aiofiles/_compat.py,sha256=dk34urK9pKm1iqKNp2jKDWatAbn3Tt18eRn31G9BcAI,205
|
||||
aiofiles/base.py,sha256=OFBt0XoRATidR8UeVldUZmul6-jgRNTn6qebAgFoMys,2121
|
||||
aiofiles/os.py,sha256=zNYhMoGytKY994XPCqetbC4ScLgmZjJCjQfZNUm2NrU,514
|
||||
aiofiles/threadpool/__init__.py,sha256=io-WG10ohdQU6CdPZEV24oBssL_9T6NEf6AEAHA18Ao,2144
|
||||
aiofiles/threadpool/__pycache__/__init__.cpython-38.pyc,,
|
||||
aiofiles/threadpool/__pycache__/binary.cpython-38.pyc,,
|
||||
aiofiles/threadpool/__pycache__/text.cpython-38.pyc,,
|
||||
aiofiles/threadpool/__pycache__/utils.cpython-38.pyc,,
|
||||
aiofiles/threadpool/binary.py,sha256=Ds7je-noWGqtwBOd21OF7DNV1m9VFt3Z29ynS7hGf-k,1102
|
||||
aiofiles/threadpool/text.py,sha256=blq1hfMSQ_kEtDMuRwf-CtHNeOAwdgIOYYVDtFJi8CI,629
|
||||
aiofiles/threadpool/utils.py,sha256=8apQJirPwOgUeRUbT2ghDpqmiXaUq_ZJ5eQRklHgz3U,1266
|
||||
@@ -0,0 +1,5 @@
|
||||
Wheel-Version: 1.0
|
||||
Generator: bdist_wheel (0.30.0)
|
||||
Root-Is-Purelib: true
|
||||
Tag: py3-none-any
|
||||
|
||||
@@ -0,0 +1 @@
|
||||
{"classifiers": ["Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: Apache Software License", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Topic :: System :: Filesystems"], "extensions": {"python.details": {"contacts": [{"email": "tinchester@gmail.com", "name": "Tin Tvrtkovic", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://github.com/Tinche/aiofiles"}}}, "generator": "bdist_wheel (0.30.0)", "license": "Apache 2.0", "metadata_version": "2.0", "name": "aiofiles", "summary": "File support for asyncio.", "version": "0.4.0"}
|
||||
@@ -0,0 +1 @@
|
||||
aiofiles
|
||||
@@ -0,0 +1,6 @@
|
||||
"""Utilities for asyncio-friendly file handling."""
|
||||
from .threadpool import open
|
||||
|
||||
__version__ = "0.4.0"
|
||||
|
||||
__all__ = ("open",)
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,8 @@
|
||||
import sys
|
||||
|
||||
try:
|
||||
from functools import singledispatch
|
||||
except ImportError: # pragma: nocover
|
||||
from singledispatch import singledispatch
|
||||
|
||||
PY_35 = sys.version_info >= (3, 5)
|
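``singledispatch``, imported here with a PyPI backport fallback for old Pythons, selects an implementation based on the type of the first argument; aiofiles uses it to route raw file objects to the matching async wrapper class. A minimal illustration of the mechanism:

```python
from functools import singledispatch

# The undecorated function is the fallback; register() adds
# per-type implementations chosen by the first argument's type.
@singledispatch
def describe(obj):
    return "generic"

@describe.register(int)
def _(obj):
    return "int"

@describe.register(str)
def _(obj):
    return "str"
```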
||||
@@ -0,0 +1,93 @@
|
||||
"""Various base classes."""
|
||||
import asyncio
|
||||
from collections.abc import Coroutine
|
||||
|
||||
|
||||
class AsyncBase:
|
||||
def __init__(self, file, loop, executor):
|
||||
self._file = file
|
||||
self._loop = loop
|
||||
        self._executor = executor

    def __aiter__(self):
        """We are our own iterator."""
        return self

    @asyncio.coroutine
    def __anext__(self):
        """Simulate normal file iteration."""
        line = yield from self.readline()
        if line:
            return line
        else:
            raise StopAsyncIteration


class _ContextManager(Coroutine):
    __slots__ = ('_coro', '_obj')

    def __init__(self, coro):
        self._coro = coro
        self._obj = None

    def send(self, value):
        return self._coro.send(value)

    def throw(self, typ, val=None, tb=None):
        if val is None:
            return self._coro.throw(typ)
        elif tb is None:
            return self._coro.throw(typ, val)
        else:
            return self._coro.throw(typ, val, tb)

    def close(self):
        return self._coro.close()

    @property
    def gi_frame(self):
        return self._coro.gi_frame

    @property
    def gi_running(self):
        return self._coro.gi_running

    @property
    def gi_code(self):
        return self._coro.gi_code

    def __next__(self):
        return self.send(None)

    @asyncio.coroutine
    def __iter__(self):
        resp = yield from self._coro
        return resp

    def __await__(self):
        resp = yield from self._coro
        return resp

    @asyncio.coroutine
    def __anext__(self):
        resp = yield from self._coro
        return resp

    @asyncio.coroutine
    def __aenter__(self):
        self._obj = yield from self._coro
        return self._obj

    @asyncio.coroutine
    def __aexit__(self, exc_type, exc, tb):
        self._obj.close()
        self._obj = None


class AiofilesContextManager(_ContextManager):
    """An adjusted async context manager for aiofiles."""

    @asyncio.coroutine
    def __aexit__(self, exc_type, exc_val, exc_tb):
        yield from self._obj.close()
        self._obj = None
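The dual behavior above (the same wrapper works with both `await open(...)` and `async with open(...) as f:`) can be sketched with modern `async`/`await` syntax. This is a simplified, hypothetical stand-in to illustrate the pattern, not aiofiles' actual class:

```python
import asyncio


class AwaitableContextManager:
    """Minimal sketch of the _ContextManager idea: wrap a coroutine so the
    caller can either await it directly or use it as an async context manager."""

    def __init__(self, coro):
        self._coro = coro
        self._obj = None

    def __await__(self):
        # Awaiting the wrapper simply awaits the underlying coroutine.
        return self._coro.__await__()

    async def __aenter__(self):
        # Entering the context runs the coroutine and hands back its result.
        self._obj = await self._coro
        return self._obj

    async def __aexit__(self, exc_type, exc, tb):
        self._obj = None


async def make_resource():
    return "resource"


async def main():
    # Style 1: await the wrapper directly.
    direct = await AwaitableContextManager(make_resource())
    # Style 2: use the wrapper as an async context manager.
    async with AwaitableContextManager(make_resource()) as ctx:
        return direct, ctx


print(asyncio.run(main()))
```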
22	python3-vckonline/lib/python3.8/site-packages/aiofiles/os.py	Normal file
@@ -0,0 +1,22 @@
"""Async executor versions of file functions from the os module."""
import asyncio
from functools import partial, wraps
import os


def wrap(func):
    @asyncio.coroutine
    @wraps(func)
    def run(*args, loop=None, executor=None, **kwargs):
        if loop is None:
            loop = asyncio.get_event_loop()
        pfunc = partial(func, *args, **kwargs)
        return loop.run_in_executor(executor, pfunc)

    return run


stat = wrap(os.stat)

if hasattr(os, "sendfile"):
    sendfile = wrap(os.sendfile)
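The `wrap` helper above turns any blocking function into a coroutine by pushing the call onto a thread pool. A minimal sketch of the same pattern with modern syntax (this is an illustration, not the library's exact code):

```python
import asyncio
import os
from functools import partial, wraps


def wrap(func):
    """Run a blocking function in the loop's default thread pool executor
    and await the result."""
    @wraps(func)
    async def run(*args, executor=None, **kwargs):
        loop = asyncio.get_running_loop()
        return await loop.run_in_executor(executor, partial(func, *args, **kwargs))
    return run


# Same idea as the module above: an async facade over os.stat.
stat = wrap(os.stat)


async def main():
    # The blocking os.stat call runs in a worker thread.
    info = await stat(".")
    return info


result = asyncio.run(main())
```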
@@ -0,0 +1,63 @@
"""Handle files using a thread pool executor."""
import asyncio

from io import (FileIO, TextIOBase, BufferedReader, BufferedWriter,
                BufferedRandom)
from functools import partial, singledispatch

from .binary import AsyncBufferedIOBase, AsyncBufferedReader, AsyncFileIO
from .text import AsyncTextIOWrapper
from ..base import AiofilesContextManager

sync_open = open

__all__ = ('open', )


def open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None,
         closefd=True, opener=None, *, loop=None, executor=None):
    return AiofilesContextManager(_open(file, mode=mode, buffering=buffering,
                                        encoding=encoding, errors=errors,
                                        newline=newline, closefd=closefd,
                                        opener=opener, loop=loop,
                                        executor=executor))


@asyncio.coroutine
def _open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None,
          closefd=True, opener=None, *, loop=None, executor=None):
    """Open an asyncio file."""
    if loop is None:
        loop = asyncio.get_event_loop()
    cb = partial(sync_open, file, mode=mode, buffering=buffering,
                 encoding=encoding, errors=errors, newline=newline,
                 closefd=closefd, opener=opener)
    f = yield from loop.run_in_executor(executor, cb)

    return wrap(f, loop=loop, executor=executor)


@singledispatch
def wrap(file, *, loop=None, executor=None):
    raise TypeError('Unsupported io type: {}.'.format(file))


@wrap.register(TextIOBase)
def _(file, *, loop=None, executor=None):
    return AsyncTextIOWrapper(file, loop=loop, executor=executor)


@wrap.register(BufferedWriter)
def _(file, *, loop=None, executor=None):
    return AsyncBufferedIOBase(file, loop=loop, executor=executor)


@wrap.register(BufferedReader)
@wrap.register(BufferedRandom)
def _(file, *, loop=None, executor=None):
    return AsyncBufferedReader(file, loop=loop, executor=executor)


@wrap.register(FileIO)
def _(file, *, loop=None, executor=None):
    return AsyncFileIO(file, loop, executor)
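The `@singledispatch` registrations above pick the async wrapper based on the concrete `io` type of the already-opened file. A self-contained sketch of that dispatch mechanism (the `describe` function and its labels are illustrative only):

```python
import io
from functools import singledispatch


@singledispatch
def describe(file):
    """Fallback for unregistered types, mirroring the wrap() default above."""
    raise TypeError('Unsupported io type: {}.'.format(file))


@describe.register(io.TextIOBase)
def _(file):
    return 'text'


@describe.register(io.BufferedIOBase)
def _(file):
    return 'buffered'


@describe.register(io.RawIOBase)
def _(file):
    return 'raw'


# Dispatch follows the class hierarchy: StringIO is a TextIOBase,
# BytesIO is a BufferedIOBase.
print(describe(io.StringIO()))
print(describe(io.BytesIO()))
```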
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,26 @@
from ..base import AsyncBase
from .utils import (delegate_to_executor, proxy_property_directly,
                    proxy_method_directly)


@delegate_to_executor('close', 'flush', 'isatty', 'read', 'read1', 'readinto',
                      'readline', 'readlines', 'seek', 'seekable', 'tell',
                      'truncate', 'writable', 'write', 'writelines')
@proxy_method_directly('detach', 'fileno', 'readable')
@proxy_property_directly('closed', 'raw')
class AsyncBufferedIOBase(AsyncBase):
    """The asyncio executor version of io.BufferedWriter."""


@delegate_to_executor('peek')
class AsyncBufferedReader(AsyncBufferedIOBase):
    """The asyncio executor version of io.BufferedReader and Random."""


@delegate_to_executor('close', 'flush', 'isatty', 'read', 'readall', 'readinto',
                      'readline', 'readlines', 'seek', 'seekable', 'tell',
                      'truncate', 'writable', 'write', 'writelines')
@proxy_method_directly('fileno', 'readable')
@proxy_property_directly('closed')
class AsyncFileIO(AsyncBase):
    """The asyncio executor version of io.FileIO."""
@@ -0,0 +1,13 @@
from .utils import (delegate_to_executor, proxy_property_directly,
                    proxy_method_directly)
from ..base import AsyncBase


@delegate_to_executor('close', 'flush', 'isatty', 'read', 'readable',
                      'readline', 'readlines', 'seek', 'seekable', 'tell',
                      'truncate', 'write', 'writable', 'writelines')
@proxy_method_directly('detach', 'fileno', 'readable')
@proxy_property_directly('buffer', 'closed', 'encoding', 'errors',
                         'line_buffering', 'newlines')
class AsyncTextIOWrapper(AsyncBase):
    """The asyncio executor version of io.TextIOWrapper."""
@@ -0,0 +1,49 @@
import asyncio
import functools


def delegate_to_executor(*attrs):
    def cls_builder(cls):
        for attr_name in attrs:
            setattr(cls, attr_name, _make_delegate_method(attr_name))
        return cls
    return cls_builder


def proxy_method_directly(*attrs):
    def cls_builder(cls):
        for attr_name in attrs:
            setattr(cls, attr_name, _make_proxy_method(attr_name))
        return cls

    return cls_builder


def proxy_property_directly(*attrs):
    def cls_builder(cls):
        for attr_name in attrs:
            setattr(cls, attr_name, _make_proxy_property(attr_name))
        return cls

    return cls_builder


def _make_delegate_method(attr_name):
    @asyncio.coroutine
    def method(self, *args, **kwargs):
        cb = functools.partial(getattr(self._file, attr_name),
                               *args, **kwargs)
        return (yield from self._loop.run_in_executor(self._executor, cb))
    return method


def _make_proxy_method(attr_name):
    def method(self, *args, **kwargs):
        return getattr(self._file, attr_name)(*args, **kwargs)
    return method


def _make_proxy_property(attr_name):
    def proxy_property(self):
        return getattr(self._file, attr_name)
    return property(proxy_property)
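The decorators above generate forwarding methods at class-decoration time and attach them with `setattr`. A minimal, synchronous sketch of the same class-decorator pattern (the `delegate` decorator and `Wrapper` class here are illustrative names, not library code):

```python
import io


def delegate(*attrs):
    """Generate one forwarding method per attribute name and attach it
    to the decorated class, mirroring proxy_method_directly above."""
    def cls_builder(cls):
        for attr_name in attrs:
            setattr(cls, attr_name, _make_method(attr_name))
        return cls
    return cls_builder


def _make_method(attr_name):
    def method(self, *args, **kwargs):
        # Forward the call to the wrapped file object.
        return getattr(self._file, attr_name)(*args, **kwargs)
    method.__name__ = attr_name
    return method


@delegate('read', 'tell')
class Wrapper:
    def __init__(self, file):
        self._file = file


w = Wrapper(io.StringIO("hello"))
print(w.read(), w.tell())
```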
@@ -0,0 +1,624 @@
Metadata-Version: 2.1
Name: aiologger
Version: 0.6.0
Summary: Asynchronous logging for python and asyncio
Home-page: https://github.com/b2wdigital/aiologger
Author: Diogo Magalhães Martins
Author-email: magalhaesmartins@icloud.com
License: MIT
Project-URL: Documentation, https://aiologger.readthedocs.io/en/latest/
Project-URL: Code, https://github.com/b2wdigital/aiologger
Project-URL: Issue tracker, https://github.com/b2wdigital/aiologger/issues
Description: # aiologger

[](http://pypi.python.org/pypi/aiologger)
[](http://pypi.python.org/pypi/aiologger)
[](https://travis-ci.org/B2W-BIT/aiologger)
[](https://codecov.io/gh/B2W-BIT/aiologger)

# About the Project

The builtin python logger is I/O blocking. This means that using the builtin
`logging` module will interfere with your asynchronous application's performance.
`aiologger` aims to be the standard asynchronous, non-blocking logging library for
Python and asyncio.

# A word about async, Python and files

TL;DR: `aiologger` is only fully async when logging to stdout/stderr. If you log into files on disk you are not being fully async and will be using threads.

`aiologger` was created when we realized that there were no async logging libs to use. At the time, Python's built-in logging infrastructure was fully sync (it still is; 3.8 beta is out). That's why we created aiologger.

Despite everything (in Linux) being a file descriptor, network file descriptors and the stdout/stderr FDs are treated differently from file-on-disk FDs. This happens because there's no stable/usable async I/O interface published by the OS to be used by Python (or any other language). That's why **logging to files is NOT truly async**. `aiologger`'s implementation of file logging uses [aiofiles](https://github.com/Tinche/aiofiles), which uses a thread pool to write the data. Keep this in mind when using `aiologger` for file logging.

Other than that, we hope `aiologger` helps you write fully async apps. :tada: :tada:

# Installation

```
pip install aiologger
```

# Testing

```
pipenv install --dev
pipenv run test
```

# Implemented interfaces

aiologger implements two different interfaces that you can use to generate your logs.
You can generate your logs using the `async/await` syntax or, if for any reason you can't (or don't want to)
change your codebase to use this syntax, you can use aiologger as if it were synchronous; behind the scenes
your logs will be generated asynchronously.


# Migrating from standard lib logging


## Using aiologger with the standard syntax

If you prefer not to use the `async/await` syntax, all you need to do is replace your logger instance with an instance of `aiologger.Logger`.
From then on you can call `logger.info()` the same way you are (probably) already calling it. Here is a simple example:

```python

import asyncio
import logging

from logging import getLogger


async def main():
    logger = getLogger(__name__)
    logging.basicConfig(level=logging.DEBUG, format="%(message)s")

    logger.debug("debug")
    logger.info("info")

    logger.warning("warning")
    logger.error("error")
    logger.critical("critical")


if __name__ == "__main__":
    asyncio.run(main())
```

Which will output the following lines:

```
debug
info
warning
error
critical
```

---

If you want to generate all your logs asynchronously, you just have to change the instance of the `logger` object.
To do that, all you need to do is change these lines:

```python
from logging import getLogger

logger = getLogger(__name__)
```

to:

```python
from aiologger import Logger

logger = Logger.with_default_handlers()
```

Here is the complete example, generating all log lines asynchronously:

```python
import asyncio
from aiologger import Logger


async def main():
    logger = Logger.with_default_handlers(name='my-logger')

    logger.debug("debug")
    logger.info("info")

    logger.warning("warning")
    logger.error("error")
    logger.critical("critical")

    await logger.shutdown()


if __name__ == "__main__":
    asyncio.run(main())
```

This code will output the following lines:

```
warning
debug
info
error
critical
```

As you might have noticed, the output order **IS NOT GUARANTEED**.
If some kind of order is important to you, you'll need to use the `await` syntax.
But in an asyncio application, where every I/O operation is asynchronous,
this shouldn't really matter.

Also note that logger calls may only be made from an `async def` or from a
function with an `async def` somewhere in its call stack.

## Using aiologger with the async/await syntax

```python
import asyncio
from aiologger import Logger


async def main():
    logger = Logger.with_default_handlers(name='my-logger')

    await logger.debug("debug at stdout")
    await logger.info("info at stdout")

    await logger.warning("warning at stderr")
    await logger.error("error at stderr")
    await logger.critical("critical at stderr")

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

The most basic use case is to log the output into `stdout` and `stderr`.
Using `Logger.with_default_handlers` you're able to effortlessly create a new
`Logger` instance with 2 distinct handlers:

* One for handling `debug` and `info` methods and writing to `stdout`;
* The other for handling `warning`, `critical`, `exception` and `error` methods and writing to `stderr`.

Since everything is asynchronous, this means that the output order is guaranteed
within the same handler, but not between distinct handlers.
The above code may output the following:

```
warning at stderr
debug at stdout
error at stderr
info at stdout
critical at stderr
```

Notice that the order within the same handler is guaranteed. E.g.:

* `debug at stdout` was outputted before `info at stdout`
* `warning at stderr` was outputted before `error at stderr`
* Between lines of distinct handlers, the order isn't guaranteed:
`warning at stderr` was outputted before `debug at stdout`

## Lazy initialization

Since the actual stream initialization only happens on the first log call, it's
possible to initialize `aiologger.Logger` instances outside a running event
loop:

```python

import asyncio
from aiologger import Logger


logger = Logger.with_default_handlers(name='my-logger')


async def main():

    await logger.debug("debug at stdout")
    await logger.info("info at stdout")

    await logger.warning("warning at stderr")
    await logger.error("error at stderr")
    await logger.critical("critical at stderr")

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

# Loggers

## JsonLogger

A simple, featureful, drop-in replacement for the default `aiologger.Logger`
that guarantees valid, single-line JSON output.

### It logs everything

```python
import asyncio
from datetime import datetime

from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers()
    await logger.info("Im a string")
    # {"logged_at": "2018-06-14T09:34:56.482817", "line_number": 9, "function": "main", "level": "INFO", "file_path": "/Users/diogo.mmartins/Library/Preferences/PyCharm2018.1/scratches/scratch_47.py", "msg": "Im a string"}

    await logger.info({
        'date_objects': datetime.now(),
        'exceptions': KeyError("Boooom"),
        'types': JsonLogger
    })
    # {"logged_at": "2018-06-14T09:34:56.483000", "line_number": 13, "function": "main", "level": "INFO", "file_path": "/Users/diogo.mmartins/Library/Preferences/PyCharm2018.1/scratches/scratch_47.py", "msg": {"date_objects": "2018-06-14T09:34:56.482953", "exceptions": "Exception: KeyError('Boooom',)", "types": "<JsonLogger aiologger-json (DEBUG)>"}}

    await logger.shutdown()


if __name__ == "__main__":
    asyncio.run(main())
```

### JsonLogger Options

`Callable[[], str]` log values may also be used to generate dynamic content
that is evaluated at serialization time. All you need to do is wrap the callable
using `CallableWrapper`:

```python
import asyncio
import logging
from random import randint

from aiologger.loggers.json import JsonLogger
from aiologger.utils import CallableWrapper


def rand():
    return randint(1, 100)


logger = JsonLogger.with_default_handlers(level=logging.DEBUG)


async def main():

    await logger.info(CallableWrapper(rand))
    # {"logged_at": "2018-06-14T09:37:52.624123", "line_number": 15, "function": "main", "level": "INFO", "file_path": "/Users/diogo.mmartins/Library/Preferences/PyCharm2018.1/scratches/scratch_47.py", "msg": 70}

    await logger.info({"Xablau": CallableWrapper(rand)})
    # {"logged_at": "2018-06-14T09:37:52.624305", "line_number": 18, "function": "main", "level": "INFO", "file_path": "/Users/diogo.mmartins/Library/Preferences/PyCharm2018.1/scratches/scratch_47.py", "msg": {"Xablau": 29}}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

### Adding content to root

By default, everything passed to the log methods is inserted inside
the `msg` root attribute, but sometimes we want to add content to the root level.

#### Flatten

This behavior may be achieved using `flatten`, which is
available both as a method parameter and as an instance attribute.

As an instance attribute, every call to a log method flattens the dict attributes into the root.

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers(level=logging.DEBUG, flatten=True)

    await logger.info({"status_code": 200, "response_time": 0.00534534})
    # {"status_code": 200, "response_time": 0.00534534, "logged_at": "2017-08-11T16:18:58.446985", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.error({"status_code": 404, "response_time": 0.00134534})
    # {"status_code": 404, "response_time": 0.00134534, "logged_at": "2017-08-11T16:18:58.446986", "line_number": 6, "function": "<module>", "level": "ERROR", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

As a method parameter, only the specific call adds its content to the root.

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers(level=logging.DEBUG)

    await logger.info({"status_code": 200, "response_time": 0.00534534}, flatten=True)
    # {"logged_at": "2017-08-11T16:23:16.312441", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py", "status_code": 200, "response_time": 0.00534534}

    await logger.error({"status_code": 404, "response_time": 0.00134534})
    # {"logged_at": "2017-08-11T16:23:16.312618", "line_number": 8, "function": "<module>", "level": "ERROR", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py", "msg": {"status_code": 404, "response_time": 0.00134534}}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

**Warning**: It is possible to overwrite keys that are already present at root level.

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers(level=logging.DEBUG)

    await logger.info({'logged_at': 'Yesterday'}, flatten=True)
    # {"logged_at": "Yesterday", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

#### Extra

The `extra` parameter allows you to add specific content to the root:

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    a = 69
    b = 666
    c = [a, b]
    logger = JsonLogger.with_default_handlers(level=logging.DEBUG)

    await logger.info("I'm a simple log")
    # {"msg": "I'm a simple log", "logged_at": "2017-08-11T12:21:05.722216", "line_number": 5, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.info({"dog": "Xablau"}, extra=locals())
    # {"logged_at": "2018-06-14T09:47:29.477705", "line_number": 14, "function": "main", "level": "INFO", "file_path": "/Users/diogo.mmartins/Library/Preferences/PyCharm2018.1/scratches/scratch_47.py", "msg": {"dog": "Xablau"}, "logger": "<JsonLogger aiologger-json (DEBUG)>", "c": [69, 666], "b": 666, "a": 69}

    await logger.shutdown()


if __name__ == "__main__":
    asyncio.run(main())
```

It also allows you to override the default root content:

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers(level=logging.DEBUG)

    await logger.info("I'm a simple log")
    # {"msg": "I'm a simple log", "logged_at": "2017-08-11T12:21:05.722216", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.info("I'm a simple log", extra={'logged_at': 'Yesterday'})
    # {"msg": "I'm a simple log", "logged_at": "Yesterday", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

It may also be used as an instance attribute:

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers(level=logging.DEBUG, extra={'logged_at': 'Yesterday'})

    await logger.info("I'm a simple log")
    # {"msg": "I'm a simple log", "logged_at": "Yesterday", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.info("I'm a simple log")
    # {"msg": "I'm a simple log", "logged_at": "Yesterday", "line_number": 6, "function": "<module>", "level": "INFO", "path": "/Users/diogo/PycharmProjects/aiologger/bla.py"}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

#### Exclude default logger fields

If you think that the default fields are too much, it's also possible to
exclude fields from the output message.

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger
from aiologger.formatters.json import FUNCTION_NAME_FIELDNAME, LOGGED_AT_FIELDNAME


async def main():
    logger = JsonLogger.with_default_handlers(
        level=logging.DEBUG,
        exclude_fields=[FUNCTION_NAME_FIELDNAME,
                        LOGGED_AT_FIELDNAME,
                        'file_path',
                        'line_number']
    )

    await logger.info("Function, file path and line number won't be printed")
    # {"level": "INFO", "msg": "Function, file path and line number won't be printed"}

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

### Serializer options

`serializer_kwargs` is available both as an instance attribute and as
a log method parameter, and may be used to pass keyword arguments to the
`serializer` function. (See more: https://docs.python.org/3/library/json.html)

For pretty printing the output, you may use the `indent` kwarg. E.g.:

```python
import asyncio
import logging
from aiologger.loggers.json import JsonLogger


async def main():
    logger = JsonLogger.with_default_handlers(
        level=logging.DEBUG,
        serializer_kwargs={'indent': 4}
    )

    await logger.info({
        "artist": "Black Country Communion",
        "song": "Cold"
    })

    await logger.shutdown()

if __name__ == "__main__":
    asyncio.run(main())
```

This results in pretty, indented output:

```javascript
{
    "logged_at": "2017-08-11T21:04:21.559070",
    "line_number": 5,
    "function": "<module>",
    "level": "INFO",
    "file_path": "/Users/diogo/Library/Preferences/PyCharm2017.1/scratches/scratch_32.py",
    "msg": {
        "artist": "Black Country Communion",
        "song": "Cold"
    }
}
```

The same result can be achieved by making a log call with `serializer_kwargs`
as a parameter.

```python
await logger.warning({'artist': 'Black Country Communion', 'song': 'Cold'}, serializer_kwargs={'indent': 4})
```
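The Python docs link above suggests the serializer is `json.dumps` by default; under that assumption, `serializer_kwargs={'indent': 4}` behaves like passing the same keyword argument straight to `json.dumps`, which can be seen with a stdlib-only sketch:

```python
import json

# Assumption: serializer_kwargs are forwarded verbatim as keyword arguments
# to the serializer, which defaults to json.dumps.
payload = {"artist": "Black Country Communion", "song": "Cold"}
print(json.dumps(payload, indent=4))
```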

# Handlers

## AsyncStreamHandler

A handler class for writing logs into a stream, which may be `sys.stdout`
or `sys.stderr`. If a stream isn't provided, it defaults to `sys.stderr`. If
`level` is not specified, `logging.NOTSET` is used. If `formatter` is not
`None`, it is used to format the log record before `emit()` gets called. A
`filter` may be used to filter log records.

```python
import sys
from aiologger.handlers.streams import AsyncStreamHandler


handler = AsyncStreamHandler(stream=sys.stdout)
```

It also accepts a level, formatter and filter at initialization.

## AsyncFileHandler

**Important**: AsyncFileHandler depends on an optional dependency, so you should
install aiologger with `pip install aiologger[aiofiles]`.

A handler class that writes logs to files. The specified file is opened
and used as the _stream_ for logging. If `mode` is not specified, `'a'` is
used. If `encoding` is not `None`, it is used to open the file with that
encoding. The file opening is delayed until the first call to `emit()`.

```python
from aiologger.handlers.files import AsyncFileHandler
from tempfile import NamedTemporaryFile


temp_file = NamedTemporaryFile()
handler = AsyncFileHandler(filename=temp_file.name)
```
# Options

* `AIOLOGGER_HANDLE_ERROR_FALLBACK_ENABLED` - An environment variable that tells
aiologger whether it should emit a log to `stderr` in case a handler's emit
raises an exception. To disable the default behaviour, set this
environment variable to a falsy value `("False", "false", "0")`. Default: `True`

# Compatibility

The explicit passing of a `loop` keyword argument, and subsequent access of a
`.loop` attribute, has been deprecated and will be removed in version 0.7.0 for
Loggers and Handlers.

Currently tested only on Python 3.6 and 3.7.

# Dependencies

None.

Keywords: logging json log output
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Framework :: AsyncIO
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Intended Audience :: System Administrators
Classifier: Intended Audience :: Information Technology
Classifier: Natural Language :: English
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: Unix
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Topic :: System :: Logging
Classifier: Topic :: Software Development :: Libraries
Requires-Python: >=3.6
Description-Content-Type: text/markdown
Provides-Extra: aiofiles
@@ -0,0 +1,25 @@
|
||||
README.md
|
||||
setup.cfg
|
||||
setup.py
|
||||
aiologger/__init__.py
|
||||
aiologger/filters.py
|
||||
aiologger/levels.py
|
||||
aiologger/logger.py
|
||||
aiologger/protocols.py
|
||||
aiologger/records.py
|
||||
aiologger/settings.py
|
||||
aiologger/utils.py
|
||||
aiologger.egg-info/PKG-INFO
|
||||
aiologger.egg-info/SOURCES.txt
|
||||
aiologger.egg-info/dependency_links.txt
|
||||
aiologger.egg-info/requires.txt
|
||||
aiologger.egg-info/top_level.txt
|
||||
aiologger/formatters/__init__.py
|
||||
aiologger/formatters/base.py
|
||||
aiologger/formatters/json.py
|
||||
aiologger/handlers/__init__.py
|
||||
aiologger/handlers/base.py
|
||||
aiologger/handlers/files.py
|
||||
aiologger/handlers/streams.py
|
||||
aiologger/loggers/__init__.py
|
||||
aiologger/loggers/json.py
|
||||
@@ -0,0 +1 @@

@@ -0,0 +1,39 @@
|
||||
../aiologger/__init__.py
|
||||
../aiologger/__pycache__/__init__.cpython-38.pyc
|
||||
../aiologger/__pycache__/filters.cpython-38.pyc
|
||||
../aiologger/__pycache__/levels.cpython-38.pyc
|
||||
../aiologger/__pycache__/logger.cpython-38.pyc
|
||||
../aiologger/__pycache__/protocols.cpython-38.pyc
|
||||
../aiologger/__pycache__/records.cpython-38.pyc
|
||||
../aiologger/__pycache__/settings.cpython-38.pyc
|
||||
../aiologger/__pycache__/utils.cpython-38.pyc
|
||||
../aiologger/filters.py
|
||||
../aiologger/formatters/__init__.py
|
||||
../aiologger/formatters/__pycache__/__init__.cpython-38.pyc
|
||||
../aiologger/formatters/__pycache__/base.cpython-38.pyc
|
||||
../aiologger/formatters/__pycache__/json.cpython-38.pyc
|
||||
../aiologger/formatters/base.py
|
||||
../aiologger/formatters/json.py
|
||||
../aiologger/handlers/__init__.py
|
||||
../aiologger/handlers/__pycache__/__init__.cpython-38.pyc
|
||||
../aiologger/handlers/__pycache__/base.cpython-38.pyc
|
||||
../aiologger/handlers/__pycache__/files.cpython-38.pyc
|
||||
../aiologger/handlers/__pycache__/streams.cpython-38.pyc
|
||||
../aiologger/handlers/base.py
|
||||
../aiologger/handlers/files.py
|
||||
../aiologger/handlers/streams.py
|
||||
../aiologger/levels.py
|
||||
../aiologger/logger.py
|
||||
../aiologger/loggers/__init__.py
|
||||
../aiologger/loggers/__pycache__/__init__.cpython-38.pyc
|
||||
../aiologger/loggers/__pycache__/json.cpython-38.pyc
|
||||
../aiologger/loggers/json.py
|
||||
../aiologger/protocols.py
|
||||
../aiologger/records.py
|
||||
../aiologger/settings.py
|
||||
../aiologger/utils.py
|
||||
PKG-INFO
|
||||
SOURCES.txt
|
||||
dependency_links.txt
|
||||
requires.txt
|
||||
top_level.txt
|
||||
@@ -0,0 +1,3 @@
|
||||
|
||||
[aiofiles]
|
||||
aiofiles==0.4.0
|
||||
@@ -0,0 +1 @@
aiologger
@@ -0,0 +1 @@
from .logger import Logger
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,98 @@
# The following code and documentation was inspired, and in some cases
# copied and modified, from the work of Vinay Sajip and contributors
# on cpython's logging package
from abc import ABC
from typing import List, Callable, Union

from aiologger.levels import LogLevel
from aiologger.records import LogRecord


class Filter:
    """
    Filter instances are used to perform arbitrary filtering of LogRecords.

    Loggers and Handlers can optionally use Filter instances to filter
    records as desired. The base filter class only allows events which are
    below a certain point in the logger hierarchy. For example, a filter
    initialized with "A.B" will allow events logged by loggers "A.B",
    "A.B.C", "A.B.C.D", "A.B.D" etc. but not "A.BB", "B.A.B" etc. If
    initialized with the empty string, all events are passed.
    """

    def __init__(self, name: str = "") -> None:
        """
        Initialize a filter.

        Initialize with the name of the logger which, together with its
        children, will have its events allowed through the filter. If no
        name is specified, allow every event.
        """
        self.name = name
        self.name_length = len(name)

    def filter(self, record: LogRecord) -> bool:
        """
        Determine if the specified record is to be logged.
        """
        if self.name_length == 0:
            return True
        elif self.name == record.name:
            return True
        elif not record.name.startswith(self.name):
            return False
        return record.name[self.name_length] == "."

    def __call__(self, record: LogRecord) -> bool:
        return self.filter(record)


_FilterCallable = Callable[[LogRecord], bool]


class Filterer(ABC):
    """
    A base class for loggers and handlers which allows them to share
    common code.
    """

    def __init__(self):
        """
        Initialize the list of filters to be an empty list.
        """
        self.filters: List[Union[Filter, _FilterCallable]] = []

    def add_filter(self, filter: Filter):
        """
        Add the specified filter to this handler.
        """
        if filter not in self.filters:
            self.filters.append(filter)

    def remove_filter(self, filter: Filter):
        """
        Remove the specified filter from this handler.
        """
        if filter in self.filters:
            self.filters.remove(filter)

    def filter(self, record: LogRecord) -> bool:
        """
        Determine if a record is loggable by consulting all the filters.

        The default is to allow the record to be logged; any filter can veto
        this, in which case the record is dropped. Returns False if the
        record is to be dropped, else True.
        """
        for filter in self.filters:
            if not filter(record):
                return False
        return True


class StdoutFilter(Filter):
    _levels = (LogLevel.DEBUG, LogLevel.INFO)

    def filter(self, record: LogRecord) -> bool:
        return record.levelno in self._levels
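The hierarchy rule in `Filter.filter` above can be checked standalone. A minimal sketch, with a plain function mirroring the method and logger names as strings (no aiologger import):

```python
def name_allows(filter_name: str, logger_name: str) -> bool:
    # Mirrors Filter.filter(): an empty filter name passes everything;
    # otherwise the logger must be the filter's name or a dotted child of it.
    if not filter_name:
        return True
    if filter_name == logger_name:
        return True
    if not logger_name.startswith(filter_name):
        return False
    return logger_name[len(filter_name)] == "."

# The docstring's examples: "A.B" allows "A.B.C" but not "A.BB" or "B.A.B"
assert name_allows("A.B", "A.B.C")
assert name_allows("A.B", "A.B.C.D")
assert not name_allows("A.B", "A.BB")
assert not name_allows("A.B", "B.A.B")
```

The dot check on the character right after the prefix is what distinguishes the child logger "A.B.C" from the unrelated sibling "A.BB".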
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,238 @@
import enum
import io
import time
import traceback
from string import Template
from typing import Union, List
from types import TracebackType

from aiologger.records import LogRecord, ExceptionInfo


class FormatStyles(str, enum.Enum):
    PERCENT = "%"
    STRING_TEMPLATE = "$"
    STRING_FORMAT = "{"


class PercentStyle:
    default_format = "%(message)s"
    asctime_format = "%(asctime)s"
    asctime_search = "%(asctime)"

    def __init__(self, fmt: str = None) -> None:
        self._fmt = fmt or self.default_format
        self.uses_time = self._fmt.find(self.asctime_search) >= 0

    def format(self, record: LogRecord) -> str:
        return self._fmt % record.__dict__


class StrFormatStyle(PercentStyle):
    default_format = "{message}"
    asctime_format = "{asctime}"
    asctime_search = "{asctime"

    def format(self, record: LogRecord) -> str:
        return self._fmt.format(**record.__dict__)


class StringTemplateStyle(PercentStyle):
    default_format = "${message}"
    asctime_format = "${asctime}"
    asctime_search = "${asctime}"

    def __init__(self, fmt: str = None) -> None:
        self._fmt = fmt or self.default_format
        self._template = Template(self._fmt)
        self.uses_time = (
            self._fmt.find("$asctime") >= 0
            or self._fmt.find(self.asctime_format) >= 0
        )

    def format(self, record: LogRecord) -> str:
        return self._template.substitute(**record.__dict__)


BASIC_FORMAT = "%(levelname)s:%(name)s:%(message)s"

_STYLES = {
    "%": (PercentStyle, BASIC_FORMAT),
    "{": (StrFormatStyle, "{levelname}:{name}:{message}"),
    "$": (StringTemplateStyle, "${levelname}:${name}:${message}"),
}


class Formatter:
    """
    Formatter instances are used to convert an ExtendedLogRecord to text.

    Formatters need to know how an ExtendedLogRecord is constructed. They are
    responsible for converting an ExtendedLogRecord to (usually) a string which
    can be interpreted by either a human or an external system. The base
    Formatter allows a formatting string to be specified. If none is supplied,
    the default value of "%(message)s" is used.

    The Formatter can be initialized with a format string which makes use of
    knowledge of the ExtendedLogRecord attributes - e.g. the default value
    mentioned above makes use of the fact that the user's message and
    arguments are pre-formatted into an ExtendedLogRecord's message attribute.
    Currently, the useful attributes of an ExtendedLogRecord are:

    %(name)s            Name of the logger (logging channel)
    %(levelno)s         Numeric logging level for the message (DEBUG, INFO,
                        WARNING, ERROR, CRITICAL)
    %(levelname)s       Text logging level for the message ("DEBUG", "INFO",
                        "WARNING", "ERROR", "CRITICAL")
    %(pathname)s        Full pathname of the source file where the logging
                        call was issued (if available)
    %(filename)s        Filename portion of pathname
    %(module)s          Module (name portion of filename)
    %(lineno)d          Source line number where the logging call was issued
                        (if available)
    %(funcName)s        Function name
    %(created)f         Time when the ExtendedLogRecord was created
                        (time.time() return value)
    %(asctime)s         Textual time when the ExtendedLogRecord was created
    %(msecs)d           Millisecond portion of the creation time
    %(relativeCreated)d Time in milliseconds when the ExtendedLogRecord was
                        created, relative to the time the logging module was
                        loaded (typically at application startup time)
    %(thread)d          Thread ID (if available)
    %(threadName)s      Thread name (if available)
    %(process)d         Process ID (if available)
    %(message)s         The result of record.get_message(), computed just as
                        the record is emitted
    """

    default_time_format = "%Y-%m-%d %H:%M:%S"
    default_msec_format = "%s,%03d"
    terminator = "\n"

    def __init__(
        self,
        fmt: str = None,
        datefmt: str = None,
        style: Union[str, FormatStyles] = "%",
    ) -> None:
        """
        Initialize the formatter with specified format strings.

        Initialize the formatter either with the specified format string, or a
        default as described above. Allow for specialized date formatting with
        the optional datefmt argument. If datefmt is omitted, you get an
        ISO8601-like (or RFC 3339-like) format.

        Use a style parameter of '%', '{' or '$' to specify that you want to
        use one of %-formatting, :meth:`str.format` (``{}``) formatting or
        :class:`string.Template` formatting in your format string.

        .. versionchanged:: 3.2
           Added the ``style`` parameter.
        """
        if style not in _STYLES:
            valid_styles = ",".join(_STYLES.keys())
            raise ValueError(f"Style must be one of: {valid_styles}")

        self._style = _STYLES[style][0](fmt)
        self._fmt = self._style._fmt
        self.datefmt = datefmt
        self.converter = time.localtime

    def format_time(self, record: LogRecord, datefmt: str = None) -> str:
        """
        Return the creation time of the specified ExtendedLogRecord as
        formatted text.

        This method should be called from format() by a formatter which
        wants to make use of a formatted time. This method can be overridden
        in formatters to provide for any specific requirement, but the
        basic behaviour is as follows: if datefmt (a string) is specified,
        it is used with time.strftime() to format the creation time of the
        record. Otherwise, an ISO8601-like (or RFC 3339-like) format is used.
        The resulting string is returned. This function uses a user-configurable
        function to convert the creation time to a tuple. By default,
        time.localtime() is used; to change this for a particular formatter
        instance, set the 'converter' attribute to a function with the same
        signature as time.localtime() or time.gmtime(). To change it for all
        formatters, for example if you want all logging times to be shown in
        GMT, set the 'converter' attribute in the Formatter class.
        """
        ct = self.converter(record.created)
        if datefmt:
            return time.strftime(datefmt, ct)
        else:
            t = time.strftime(self.default_time_format, ct)
            return self.default_msec_format % (t, record.msecs)

    def format_exception(self, exception_info: ExceptionInfo) -> str:
        """
        Format and return the specified exception information as a string.

        This default implementation just uses traceback.print_exception().
        """
        string_io = io.StringIO()
        tb = exception_info[2]

        traceback.print_exception(
            exception_info[0], exception_info[1], tb, None, string_io
        )

        s = string_io.getvalue()
        string_io.close()
        if s[-1:] == self.terminator:
            s = s[:-1]
        return s

    def format_message(self, record: LogRecord) -> str:
        return self._style.format(record)

    def format_stack(self, stack_info):
        """
        This method is provided as an extension point for specialized
        formatting of stack information.

        The input data is a string as returned from a call to
        :func:`traceback.print_stack`, but with the last trailing newline
        removed.

        The base implementation just returns the value passed in.
        """
        return stack_info

    @staticmethod
    def format_traceback(tb: TracebackType) -> List[str]:
        formatted_tb = "".join(traceback.format_tb(tb))
        return formatted_tb.strip().split("\n")

    def format(self, record: LogRecord) -> str:
        """
        Format the specified record as text.

        The record's attribute dictionary is used as the operand to a
        string formatting operation which yields the returned string.
        Before formatting the dictionary, a couple of preparatory steps
        are carried out. The message attribute of the record is computed
        using LogRecord.get_message(). If the formatting string uses the
        time (as determined by the style's `uses_time` attribute),
        format_time() is called to format the event time. If there is
        exception information, it is formatted using format_exception()
        and appended to the message.
        """
        record.message = record.get_message()
        if self._style.uses_time:
            record.asctime = self.format_time(record, self.datefmt)
        s = self.format_message(record)
        if record.exc_info:
            # Cache the traceback text to avoid converting it multiple times
            # (it's constant anyway)
            if not record.exc_text:
                record.exc_text = self.format_exception(record.exc_info)
        if record.exc_text:
            if s[-1:] != self.terminator:
                s = s + self.terminator
            s = s + record.exc_text
        if record.stack_info:
            if s[-1:] != self.terminator:
                s = s + self.terminator
            s = s + self.format_stack(record.stack_info)
        return s
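The three `_STYLES` entries above delegate to plain Python formatting mechanisms, which can be checked standalone. A minimal sketch (no aiologger import; a dict stands in for the record's `__dict__`):

```python
from string import Template

# One record rendered through each of the three supported styles:
# %-formatting, str.format and string.Template.
record = {"levelname": "INFO", "name": "app", "message": "started"}

percent_out = "%(levelname)s:%(name)s:%(message)s" % record
brace_out = "{levelname}:{name}:{message}".format(**record)
template_out = Template("${levelname}:${name}:${message}").substitute(**record)

assert percent_out == brace_out == template_out == "INFO:app:started"
```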
@@ -0,0 +1,155 @@
import json
import traceback
from datetime import datetime, timezone
from inspect import istraceback
from typing import Callable, Iterable, Union, Dict, Optional, List

from aiologger.formatters.base import Formatter
from aiologger.levels import LEVEL_TO_NAME
from aiologger.records import LogRecord
from aiologger.utils import CallableWrapper


LOGGED_AT_FIELDNAME = "logged_at"
LINE_NUMBER_FIELDNAME = "line_number"
FUNCTION_NAME_FIELDNAME = "function"
LOG_LEVEL_FIELDNAME = "level"
MSG_FIELDNAME = "msg"
FILE_PATH_FIELDNAME = "file_path"


class JsonFormatter(Formatter):
    def __init__(
        self,
        serializer: Callable[..., str] = json.dumps,
        default_msg_fieldname: str = None,
    ) -> None:
        super(JsonFormatter, self).__init__()
        self.serializer = serializer
        self.default_msg_fieldname = default_msg_fieldname or MSG_FIELDNAME

    def _default_handler(self, obj):
        if isinstance(obj, datetime):
            return obj.isoformat()
        elif istraceback(obj):
            tb = "".join(traceback.format_tb(obj))
            return tb.strip().split("\n")
        elif isinstance(obj, Exception):
            return "Exception: %s" % repr(obj)
        elif type(obj) is type:
            return str(obj)
        elif isinstance(obj, CallableWrapper):
            return obj()
        return str(obj)

    def format(self, record: LogRecord) -> str:
        """
        Formats a record and serializes it as a JSON str. If the record
        message isn't already a dict, initializes a new dict with
        `default_msg_fieldname` as the key and the record msg as the value.
        """
        msg: Union[str, dict] = record.msg
        if not isinstance(msg, dict):
            msg = {self.default_msg_fieldname: msg}

        if record.exc_info:
            msg["exc_info"] = record.exc_info
        if record.exc_text:
            msg["exc_text"] = record.exc_text

        return self.serializer(msg, default=self._default_handler)

    @classmethod
    def format_error_msg(cls, record: LogRecord, exception: Exception) -> Dict:
        traceback_info: Optional[List[str]]
        if exception.__traceback__:
            traceback_info = cls.format_traceback(exception.__traceback__)
        else:
            traceback_info = None
        return {
            "record": {
                LINE_NUMBER_FIELDNAME: record.lineno,
                LOG_LEVEL_FIELDNAME: record.levelname,
                FILE_PATH_FIELDNAME: record.filename,
                FUNCTION_NAME_FIELDNAME: record.funcName,
                MSG_FIELDNAME: str(record.msg),
            },
            LOGGED_AT_FIELDNAME: datetime.utcnow().isoformat(),
            "logger_exception": {
                "type": str(type(exception)),
                "exc": str(exception),
                "traceback": traceback_info,
            },
        }


class ExtendedJsonFormatter(JsonFormatter):
    level_to_name_mapping = LEVEL_TO_NAME
    default_fields = frozenset(
        [
            LOG_LEVEL_FIELDNAME,
            LOGGED_AT_FIELDNAME,
            LINE_NUMBER_FIELDNAME,
            FUNCTION_NAME_FIELDNAME,
            FILE_PATH_FIELDNAME,
        ]
    )

    def __init__(
        self,
        serializer: Callable[..., str] = json.dumps,
        default_msg_fieldname: str = None,
        exclude_fields: Iterable[str] = None,
        tz: timezone = None,
    ) -> None:
        super(ExtendedJsonFormatter, self).__init__(
            serializer=serializer, default_msg_fieldname=default_msg_fieldname
        )
        self.tz = tz
        if exclude_fields is None:
            self.log_fields = self.default_fields
        else:
            self.log_fields = self.default_fields - set(exclude_fields)

    def formatter_fields_for_record(self, record: LogRecord):
        """
        :type record: aiologger.records.ExtendedLogRecord
        """
        datetime_serialized = (
            datetime.now(timezone.utc).astimezone(self.tz).isoformat()
        )

        default_fields = (
            (LOGGED_AT_FIELDNAME, datetime_serialized),
            (LINE_NUMBER_FIELDNAME, record.lineno),
            (FUNCTION_NAME_FIELDNAME, record.funcName),
            (LOG_LEVEL_FIELDNAME, self.level_to_name_mapping[record.levelno]),
            (FILE_PATH_FIELDNAME, record.pathname),
        )

        for field, value in default_fields:
            if field in self.log_fields:
                yield field, value

    def format(self, record) -> str:
        """
        :type record: aiologger.records.ExtendedLogRecord
        """
        msg = dict(self.formatter_fields_for_record(record))
        if record.flatten and isinstance(record.msg, dict):
            msg.update(record.msg)
        else:
            msg[MSG_FIELDNAME] = record.msg

        if record.extra:
            msg.update(record.extra)
        if record.exc_info:
            msg["exc_info"] = record.exc_info
        if record.exc_text:
            msg["exc_text"] = record.exc_text

        return self.serializer(
            msg, default=self._default_handler, **record.serializer_kwargs
        )
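The `default=` hook handed to the serializer above is standard `json.dumps` behavior: it is called for any object the encoder cannot serialize. A reduced, standalone sketch of the fallback chain in `_default_handler` (datetimes become ISO strings, exceptions a tagged repr, everything else `str()`):

```python
import json
from datetime import datetime, timezone

def default_handler(obj):
    # Reduced version of the JsonFormatter._default_handler chain above
    if isinstance(obj, datetime):
        return obj.isoformat()
    if isinstance(obj, Exception):
        return "Exception: %s" % repr(obj)
    return str(obj)

payload = {
    "msg": "boom",
    "error": ValueError("bad input"),
    "at": datetime(2020, 1, 1, tzinfo=timezone.utc),
}
out = json.loads(json.dumps(payload, default=default_handler))

assert out["error"] == "Exception: ValueError('bad input')"
assert out["at"] == "2020-01-01T00:00:00+00:00"
```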
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,126 @@
import abc
import asyncio
import json
import sys
from asyncio import AbstractEventLoop
from typing import Optional, Union

from aiologger import settings
from aiologger.utils import loop_compat
from aiologger.filters import Filterer
from aiologger.formatters.base import Formatter
from aiologger.formatters.json import JsonFormatter
from aiologger.levels import LogLevel, get_level_name, check_level
from aiologger.records import LogRecord


# Default formatter, used when a Handler isn't given one explicitly
_default_formatter = Formatter()


@loop_compat
class Handler(Filterer):
    """
    Handler instances dispatch logging events to specific destinations.

    The base handler class. Acts as a placeholder which defines the Handler
    interface. Handlers can optionally use Formatter instances to format
    records as desired. By default, a plain Formatter is used, which emits
    the 'raw' message as determined by record.message.
    """

    def __init__(self, level: LogLevel = LogLevel.NOTSET) -> None:
        """
        Initializes the instance - basically setting the formatter to the
        default and the filter list to empty.
        """
        Filterer.__init__(self)
        self._level = check_level(level)
        self.formatter: Formatter = _default_formatter

    @property
    @abc.abstractmethod
    def initialized(self):
        raise NotImplementedError()

    @property
    def level(self):
        return self._level

    @level.setter
    def level(self, value: Union[str, int, LogLevel]):
        """
        Set the logging level of this handler.
        """
        self._level = check_level(value)

    @abc.abstractmethod
    async def emit(self, record: LogRecord) -> None:
        """
        Do whatever it takes to actually log the specified logging record.

        This version is intended to be implemented by subclasses and so
        raises a NotImplementedError.
        """
        raise NotImplementedError(
            "emit must be implemented by Handler subclasses"
        )

    async def handle(self, record: LogRecord) -> bool:
        """
        Conditionally emit the specified logging record.

        Emission depends on filters which may have been added to the handler.
        Returns whether the filters passed the record for emission.
        """
        rv = self.filter(record)
        if rv:
            await self.emit(record)
        return rv

    async def flush(self) -> None:
        """
        Ensure all logging output has been flushed.

        This version does nothing and is intended to be implemented by
        subclasses.
        """
        pass

    @abc.abstractmethod
    async def close(self) -> None:
        """
        Tidy up any resources used by the handler.

        This version removes the handler from an internal map of handlers,
        _handlers, which is used for handler lookup by name. Subclasses
        should ensure that this gets called from overridden close()
        methods.
        """
        raise NotImplementedError(
            "close must be implemented by Handler subclasses"
        )

    async def handle_error(
        self, record: LogRecord, exception: Exception
    ) -> None:
        """
        Handle errors which occur during an emit() call.

        This method should be called from handlers when an exception is
        encountered during an emit() call. This is what is mostly wanted
        for a logging system - most users will not care about errors in
        the logging system, they are more interested in application errors.
        You could, however, replace this with a custom handler if you wish.
        The record which was being processed is passed in to this method.
        """
        if not settings.HANDLE_ERROR_FALLBACK_ENABLED:
            return

        msg = JsonFormatter.format_error_msg(record, exception)
        json.dump(msg, sys.stderr)
        sys.stderr.write("\n")

    def __repr__(self):
        level = get_level_name(self.level)
        return f"<{self.__class__.__name__} ({level})>"
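The `handle()`/`emit()` contract above (filter first, emit only on pass, return the filter result) can be exercised with a stand-in handler. A minimal sketch with no aiologger import; records are plain dicts and the filter is a hypothetical level check:

```python
import asyncio

class ListHandler:
    # Stand-in for Handler: same handle()/emit() shape, but records are
    # dicts and emitted messages are collected in memory.
    def __init__(self):
        self.messages = []

    def filter(self, record: dict) -> bool:
        return record["levelno"] >= 20  # drop DEBUG (10), keep INFO (20)+

    async def emit(self, record: dict) -> None:
        self.messages.append(record["msg"])

    async def handle(self, record: dict) -> bool:
        # Mirrors Handler.handle(): emit only when the filter passes
        rv = self.filter(record)
        if rv:
            await self.emit(record)
        return rv

async def main():
    handler = ListHandler()
    await handler.handle({"levelno": 10, "msg": "debug"})
    await handler.handle({"levelno": 20, "msg": "info"})
    return handler.messages

assert asyncio.run(main()) == ["info"]
```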
@@ -0,0 +1,480 @@
# The following code and documentation was inspired, and in some cases
# copied and modified, from the work of Vinay Sajip and contributors
# on cpython's logging package

import abc
import asyncio
import datetime
import enum
import os
import re
import time
from asyncio import AbstractEventLoop
from typing import Callable, List, Optional

import aiofiles
from aiofiles.threadpool import AsyncTextIOWrapper

from aiologger.handlers.base import Handler
from aiologger.records import LogRecord
from aiologger.utils import classproperty, get_running_loop, loop_compat


@loop_compat
class AsyncFileHandler(Handler):
    terminator = "\n"

    def __init__(
        self, filename: str, mode: str = "a", encoding: str = None
    ) -> None:
        super().__init__()
        filename = os.fspath(filename)
        self.absolute_file_path = os.path.abspath(filename)
        self.mode = mode
        self.encoding = encoding
        self.stream: AsyncTextIOWrapper = None
        self._initialization_lock = None

    @property
    def initialized(self):
        return self.stream is not None

    async def _init_writer(self):
        """
        Open the current base file with the (original) mode and encoding.
        """
        if not self._initialization_lock:
            self._initialization_lock = asyncio.Lock()

        async with self._initialization_lock:
            if not self.initialized:
                self.stream = await aiofiles.open(
                    file=self.absolute_file_path,
                    mode=self.mode,
                    encoding=self.encoding,
                )

    async def flush(self):
        await self.stream.flush()

    async def close(self):
        if not self.initialized:
            return
        await self.stream.flush()
        await self.stream.close()
        self.stream = None
        self._initialization_lock = None

    async def emit(self, record: LogRecord):
        if not self.initialized:
            await self._init_writer()

        try:
            msg = self.formatter.format(record)

            # Write order is not guaranteed. String concatenation required
            await self.stream.write(msg + self.terminator)

            await self.stream.flush()
        except Exception as exc:
            await self.handle_error(record, exc)

Namer = Callable[[str], str]
Rotator = Callable[[str, str], None]


class BaseAsyncRotatingFileHandler(AsyncFileHandler, metaclass=abc.ABCMeta):
    def __init__(
        self,
        filename: str,
        mode: str = "a",
        encoding: str = None,
        namer: Namer = None,
        rotator: Rotator = None,
    ) -> None:
        super().__init__(filename, mode, encoding)
        self.mode = mode
        self.encoding = encoding
        self.namer = namer
        self.rotator = rotator
        self._rollover_lock: Optional[asyncio.Lock] = None

    def should_rollover(self, record: LogRecord) -> bool:
        raise NotImplementedError

    async def do_rollover(self):
        raise NotImplementedError

    async def emit(self, record: LogRecord):  # type: ignore
        """
        Emit a record.

        Output the record to the file, catering for rollover as described
        in `do_rollover`.
        """
        try:
            if self.should_rollover(record):
                if not self._rollover_lock:
                    self._rollover_lock = asyncio.Lock()

                async with self._rollover_lock:
                    if self.should_rollover(record):
                        await self.do_rollover()
            await super().emit(record)
        except Exception as exc:
            await self.handle_error(record, exc)

    def rotation_filename(self, default_name: str) -> str:
        """
        Modify the filename of a log file when rotating.

        This is provided so that a custom filename can be provided.

        :param default_name: The default name for the log file.
        """
        if self.namer is None:
            return default_name

        return self.namer(default_name)

    async def rotate(self, source: str, dest: str):
        """
        When rotating, rotate the current log.

        The default implementation calls the 'rotator' attribute of the
        handler, if it's callable, passing the source and dest arguments to
        it. If the attribute isn't callable (the default is None), the source
        is simply renamed to the destination.

        :param source: The source filename. This is normally the base
                       filename, e.g. 'test.log'
        :param dest: The destination filename. This is normally
                     what the source is rotated to, e.g. 'test.log.1'.
        """
        if self.rotator is None:
            # logging issue 18940: A file may not have been created if delay is True.
            loop = get_running_loop()
            if await loop.run_in_executor(None, lambda: os.path.exists(source)):
                await loop.run_in_executor(  # type: ignore
                    None, lambda: os.rename(source, dest)
                )
        else:
            self.rotator(source, dest)

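A sketch of the `namer` hook consumed by `rotation_filename` above. The `gzip_namer` callable is a hypothetical example (a typical use is tagging rotated files for compression), not part of the handler:

```python
def gzip_namer(default_name: str) -> str:
    # Hypothetical namer: append a .gz suffix to the rotated filename
    return default_name + ".gz"

def rotation_filename(default_name, namer=None):
    # Mirrors BaseAsyncRotatingFileHandler.rotation_filename(): fall back
    # to the default rotated name when no namer is configured.
    if namer is None:
        return default_name
    return namer(default_name)

assert rotation_filename("test.log.1") == "test.log.1"
assert rotation_filename("test.log.1", gzip_namer) == "test.log.1.gz"
```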
class RolloverInterval(str, enum.Enum):
    SECONDS = "S"
    MINUTES = "M"
    HOURS = "H"
    DAYS = "D"
    MONDAYS = "W0"
    TUESDAYS = "W1"
    WEDNESDAYS = "W2"
    THUERDAYS = "W3"
    FRIDAYS = "W4"
    SATURDAYS = "W5"
    SUNDAYS = "W6"
    MIDNIGHT = "MIDNIGHT"

    @classproperty
    def WEEK_DAYS(cls):
        return (
            cls.MONDAYS,
            cls.TUESDAYS,
            cls.WEDNESDAYS,
            cls.THUERDAYS,
            cls.FRIDAYS,
            cls.SATURDAYS,
            cls.SUNDAYS,
        )


ONE_MINUTE_IN_SECONDS = 60
ONE_HOUR_IN_SECONDS = 60 * 60
ONE_DAY_IN_SECONDS = ONE_HOUR_IN_SECONDS * 24
ONE_WEEK_IN_SECONDS = 7 * ONE_DAY_IN_SECONDS


class AsyncTimedRotatingFileHandler(BaseAsyncRotatingFileHandler):
    """
    Handler for logging to a file, rotating the log file at certain timed
    intervals.

    If `backup_count` is > 0, when rollover is done, no more than
    `backup_count` files are kept - the oldest ones are deleted.

    | when       | at_time behavior                                       |
    |------------|--------------------------------------------------------|
    | SECONDS    | at_time will be ignored                                |
    | MINUTES    | -- // --                                               |
    | HOURS      | -- // --                                               |
    | DAYS       | at_time will be IGNORED. See also MIDNIGHT             |
    | MONDAYS    | rotation happens every WEEK on MONDAY at ${at_time}    |
    | TUESDAYS   | rotation happens every WEEK on TUESDAY at ${at_time}   |
    | WEDNESDAYS | rotation happens every WEEK on WEDNESDAY at ${at_time} |
    | THUERDAYS  | rotation happens every WEEK on THURSDAY at ${at_time}  |
    | FRIDAYS    | rotation happens every WEEK on FRIDAY at ${at_time}    |
    | SATURDAYS  | rotation happens every WEEK on SATURDAY at ${at_time}  |
    | SUNDAYS    | rotation happens every WEEK on SUNDAY at ${at_time}    |
    | MIDNIGHT   | rotation happens every DAY at ${at_time}               |
    """

    def __init__(
        self,
        filename: str,
        when: RolloverInterval = RolloverInterval.HOURS,
        interval: int = 1,
        backup_count: int = 0,
        encoding: str = None,
        utc: bool = False,
        at_time: datetime.time = None,
    ) -> None:
        super().__init__(filename=filename, mode="a", encoding=encoding)
        self.when = when.upper()
        self.backup_count = backup_count
        self.utc = utc
        self.at_time = at_time
        # Calculate the real rollover interval, which is just the number of
        # seconds between rollovers. Also set the filename suffix used when
        # a rollover occurs. Current 'when' events supported:
        # S - Seconds
        # M - Minutes
        # H - Hours
        # D - Days
        # midnight - roll over at midnight
        # W{0-6} - roll over on a certain day; 0 - Monday
        #
        # Case of the 'when' specifier is not important; lower or upper case
        # will work.
        if self.when == RolloverInterval.SECONDS:
            self.interval = 1  # one second
            self.suffix = "%Y-%m-%d_%H-%M-%S"
            ext_match = r"^\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}(\.\w+)?$"
        elif self.when == RolloverInterval.MINUTES:
            self.interval = ONE_MINUTE_IN_SECONDS  # one minute
            self.suffix = "%Y-%m-%d_%H-%M"
            ext_match = r"^\d{4}-\d{2}-\d{2}_\d{2}-\d{2}(\.\w+)?$"
        elif self.when == RolloverInterval.HOURS:
            self.interval = ONE_HOUR_IN_SECONDS  # one hour
            self.suffix = "%Y-%m-%d_%H"
            ext_match = r"^\d{4}-\d{2}-\d{2}_\d{2}(\.\w+)?$"
        elif (
            self.when == RolloverInterval.DAYS
            or self.when == RolloverInterval.MIDNIGHT
        ):
            self.interval = ONE_DAY_IN_SECONDS  # one day
            self.suffix = "%Y-%m-%d"
            ext_match = r"^\d{4}-\d{2}-\d{2}(\.\w+)?$"
        elif self.when.startswith("W"):
            if self.when not in RolloverInterval.WEEK_DAYS:
                raise ValueError(
                    f"Invalid day specified for weekly rollover: {self.when}"
                )
            self.interval = ONE_WEEK_IN_SECONDS  # one week
            self.day_of_week = int(self.when[1])
            self.suffix = "%Y-%m-%d"
            ext_match = r"^\d{4}-\d{2}-\d{2}(\.\w+)?$"
        else:
            raise ValueError(f"Invalid RolloverInterval specified: {self.when}")

        self.ext_match = re.compile(ext_match, re.ASCII)
        self.interval = self.interval * interval  # multiply by units requested
        # The following line added because the filename passed in could be a
        # path object (see Issue #27493), but self.baseFilename will be a string
        filename = self.absolute_file_path
        if os.path.exists(filename):  # todo: IO. Remove or postpone
            t = int(os.stat(filename).st_mtime)
        else:
            t = int(time.time())
        self.rollover_at = self.compute_rollover(t)

    def compute_rollover(self, current_time: int) -> int:
        """
        Work out the rollover time based on the specified time.

        If we are rolling over at midnight or weekly, then the interval is
        already known; what we need to figure out is WHEN the next rollover
        occurs. In other words, if you are rolling over at midnight, then
        your base interval is 1 day, but you want to start that one-day clock
        at midnight, not now. So, we have to fudge the `rollover_at` value in
        order to trigger the first rollover at the right time. After that,
        the regular interval will take care of the rest. Note that this code
        doesn't care about leap seconds. :)
        """
        result = current_time + self.interval

        if (
            self.when == RolloverInterval.MIDNIGHT
            or self.when in RolloverInterval.WEEK_DAYS
        ):
            if self.utc:
                t = time.gmtime(current_time)
            else:
                t = time.localtime(current_time)
            current_hour = t[3]
            current_minute = t[4]
            current_second = t[5]
            current_day = t[6]
            # r is the number of seconds left between now and the next rotation
            if self.at_time is None:
                rotate_ts = ONE_DAY_IN_SECONDS
            else:
                rotate_ts = (
                    self.at_time.hour * 60 + self.at_time.minute
                ) * 60 + self.at_time.second

            r = rotate_ts - (
                (current_hour * 60 + current_minute) * 60 + current_second
            )
            if r < 0:
                # The rotate time is before the current time (for example when
                # self.rotateAt is 13:45 and it is now 14:15), so rotation is
                # tomorrow.
                r += ONE_DAY_IN_SECONDS
                current_day = (current_day + 1) % 7
            result = current_time + r
            # If we are rolling over on a certain day, add in the number of days
            # until the next rollover, but offset by 1 since we just calculated
            # the time until the next day starts. There are three cases:
            # Case 1) The day to rollover is today; in this case, do nothing
            # Case 2) The day to rollover is further in the interval (i.e., today
            #         is day 2 (Wednesday) and rollover is on day 6 (Sunday).
            #         Days to next rollover is simply 6 - 2 - 1, or 3.
            # Case 3) The day to rollover is behind us in the interval (i.e.,
            #         today is day 5 (Saturday) and rollover is on day 3
            #         (Thursday). Days to rollover is 6 - 5 + 3, or 4. In this
            #         case, it's the number of days left in the current week (1)
            #         plus the number of days in the next week until the
            #         rollover day (3).
            # The calculations described in 2) and 3) above need to have a day
            # added. This is because the above time calculation takes us to
            # midnight on this day, i.e. the start of the next day.
            if self.when in RolloverInterval.WEEK_DAYS:
                day = current_day  # 0 is Monday
                if day != self.day_of_week:
                    if day < self.day_of_week:
                        days_to_wait = self.day_of_week - day
                    else:
                        days_to_wait = 6 - day + self.day_of_week + 1
                    new_rollover_at = result + (
                        days_to_wait * ONE_DAY_IN_SECONDS
                    )
                    if not self.utc:
                        dst_now = t[-1]
                        dst_at_rollover = time.localtime(new_rollover_at)[-1]
                        if dst_now != dst_at_rollover:
                            if not dst_now:
                                # DST kicks in before next rollover, so we need
                                # to deduct an hour
                                new_rollover_at -= ONE_HOUR_IN_SECONDS
                            else:
                                # DST bows out before next rollover, so we need
                                # to add an hour
                                new_rollover_at += ONE_HOUR_IN_SECONDS
                    result = new_rollover_at
        return result

    def should_rollover(self, record: LogRecord) -> bool:
        """
        Determine if rollover should occur.

        record is not used, as we are just comparing times, but it is needed
        so that the method signatures are the same.
        """
        t = int(time.time())
        return t >= self.rollover_at

    async def get_files_to_delete(self) -> List[str]:
        """
        Determine the files to delete when rolling over.
        """
        dir_name, base_name = os.path.split(self.absolute_file_path)
        loop = get_running_loop()
        file_names = await loop.run_in_executor(
            None, lambda: os.listdir(dir_name)
        )
        result = []
        prefix = base_name + "."
        plen = len(prefix)
        for file_name in file_names:
            if file_name[:plen] == prefix:
                suffix = file_name[plen:]
                if self.ext_match.match(suffix):
                    result.append(os.path.join(dir_name, file_name))
        if len(result) < self.backup_count:
            return []
        # os.listdir order is not defined; sort ascending so the entries with
        # the oldest suffixes come first and are the ones returned for deletion
        result.sort()
        return result[: len(result) - self.backup_count]

    async def _delete_files(self, file_paths: List[str]):
        loop = get_running_loop()
        delete_tasks = (
            # bind file_path as a default argument so each executor job
            # unlinks its own path, not whatever the loop variable refers
            # to by the time the job actually runs
            loop.run_in_executor(None, lambda path=file_path: os.unlink(path))
            for file_path in file_paths
        )
        await asyncio.gather(*delete_tasks)

    async def do_rollover(self):
        """
        Do a rollover; in this case, a date/time stamp is appended to the
        filename when the rollover happens. However, you want the file to be
        named for the start of the interval, not the current time. If there
        is a backup count, then we have to get a list of matching filenames,
        sort them and remove the one with the oldest suffix.
        """
        if self.stream:
            await self.stream.close()
            self.stream = None
        # get the time that this sequence started at and make it a TimeTuple
        current_time = int(time.time())
        dst_now = time.localtime(current_time)[-1]
        t = self.rollover_at - self.interval
        if self.utc:
            time_tuple = time.gmtime(t)
        else:
            time_tuple = time.localtime(t)
            dst_then = time_tuple[-1]
            if dst_now != dst_then:
                if dst_now:
                    addend = ONE_HOUR_IN_SECONDS
                else:
                    addend = -ONE_HOUR_IN_SECONDS
                time_tuple = time.localtime(t + addend)
        destination_file_path = self.rotation_filename(
            self.absolute_file_path
            + "."
            + time.strftime(self.suffix, time_tuple)
        )
        loop = get_running_loop()
        if await loop.run_in_executor(
            None, lambda: os.path.exists(destination_file_path)
        ):
            await loop.run_in_executor(
                None, lambda: os.unlink(destination_file_path)
            )
        await self.rotate(self.absolute_file_path, destination_file_path)
        if self.backup_count > 0:
            files_to_delete = await self.get_files_to_delete()
            if files_to_delete:
                await self._delete_files(files_to_delete)

        await self._init_writer()
        new_rollover_at = self.compute_rollover(current_time)
        while new_rollover_at <= current_time:
            new_rollover_at = new_rollover_at + self.interval
        # If DST changes and midnight or weekly rollover, adjust for this.
        if (
            self.when == RolloverInterval.MIDNIGHT
            or self.when in RolloverInterval.WEEK_DAYS
        ) and not self.utc:
            dst_at_rollover = time.localtime(new_rollover_at)[-1]
            if dst_now != dst_at_rollover:
                if not dst_now:
                    # DST kicks in before next rollover, so we need to deduct an hour
                    addend = -ONE_HOUR_IN_SECONDS
                else:
                    # DST bows out before next rollover, so we need to add an hour
                    addend = ONE_HOUR_IN_SECONDS
                new_rollover_at += addend
        self.rollover_at = new_rollover_at

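Each `suffix`/`ext_match` pair above is a mirror image: `time.strftime` produces the timestamp that the regex later recognizes when `get_files_to_delete` scans for old backups. A minimal self-contained sketch, using the suffix and pattern copied from the `SECONDS` branch above:

```python
import re
import time

# suffix and pattern for RolloverInterval.SECONDS, as set above
suffix = "%Y-%m-%d_%H-%M-%S"
ext_match = re.compile(
    r"^\d{4}-\d{2}-\d{2}_\d{2}-\d{2}-\d{2}(\.\w+)?$", re.ASCII
)

# a rolled-over file gets the formatted timestamp appended after a dot;
# the same pattern must accept that timestamp during backup cleanup
stamp = time.strftime(suffix, time.gmtime(0))
print(stamp)                                     # 1970-01-01_00-00-00
print(bool(ext_match.match(stamp)))              # True
print(bool(ext_match.match(stamp + ".log")))     # True (optional extension)
print(bool(ext_match.match("not-a-timestamp")))  # False
```

The `(\.\w+)?` tail is what lets namer callbacks append an extra extension (e.g. `.log`) without breaking cleanup.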
@@ -0,0 +1,98 @@
import asyncio
import sys
from asyncio import AbstractEventLoop, StreamWriter
from typing import Union, Optional

from aiologger.utils import get_running_loop, loop_compat
from aiologger.filters import Filter
from aiologger.formatters.base import Formatter
from aiologger.handlers.base import Handler
from aiologger.levels import LogLevel
from aiologger.protocols import AiologgerProtocol
from aiologger.records import LogRecord


@loop_compat
class AsyncStreamHandler(Handler):
    terminator = "\n"

    def __init__(
        self,
        stream=None,
        level: Union[str, int, LogLevel] = LogLevel.NOTSET,
        formatter: Formatter = None,
        filter: Filter = None,
    ) -> None:
        super().__init__()
        if stream is None:
            stream = sys.stderr
        self.stream = stream
        self.level = level
        if formatter is None:
            formatter = Formatter()
        self.formatter: Formatter = formatter
        if filter:
            self.add_filter(filter)
        self.protocol_class = AiologgerProtocol
        self._initialization_lock = asyncio.Lock()
        self.writer: Optional[StreamWriter] = None

    @property
    def initialized(self):
        return self.writer is not None

    async def _init_writer(self) -> StreamWriter:
        async with self._initialization_lock:
            if self.writer is not None:
                return self.writer

            loop = get_running_loop()
            transport, protocol = await loop.connect_write_pipe(
                self.protocol_class, self.stream
            )

            self.writer = StreamWriter(  # type: ignore # https://github.com/python/typeshed/pull/2719
                transport=transport, protocol=protocol, reader=None, loop=loop
            )
            return self.writer

    async def handle(self, record: LogRecord) -> bool:
        """
        Conditionally emit the specified logging record.

        Emission depends on filters which may have been added to the handler.
        """
        rv = self.filter(record)
        if rv:
            await self.emit(record)
        return rv

    async def flush(self):
        await self.writer.drain()

    async def emit(self, record: LogRecord):
        """
        Actually log the specified logging record to the stream.
        """
        if self.writer is None:
            self.writer = await self._init_writer()

        try:
            msg = self.formatter.format(record) + self.terminator

            self.writer.write(msg.encode())
            await self.writer.drain()
        except Exception as exc:
            await self.handle_error(record, exc)

    async def close(self):
        """
        Tidy up any resources used by the handler.

        This version removes the handler from an internal map of handlers;
        subclasses should ensure that this gets called from overridden
        close() methods.
        """
        if self.writer is None:
            return
        await self.flush()
        self.writer.close()
@@ -0,0 +1,47 @@
import enum
from typing import Union


class LogLevel(enum.IntEnum):
    CRITICAL = 50
    FATAL = CRITICAL
    ERROR = 40
    WARNING = 30
    WARN = WARNING
    INFO = 20
    DEBUG = 10
    NOTSET = 0


NAME_TO_LEVEL = {level: LogLevel[level].value for level in LogLevel.__members__}
LEVEL_TO_NAME = {level.value: level.name for level in LogLevel}


def get_level_name(level: Union[int, LogLevel]) -> str:
    """
    Return the textual representation of logging level 'level'.

    If the level is one of the predefined levels (CRITICAL, ERROR, WARNING,
    INFO, DEBUG) then you get the corresponding string.

    If a numeric value corresponding to one of the defined levels is passed
    in, the corresponding string representation is returned.
    """
    try:
        return LEVEL_TO_NAME[level]
    except KeyError as e:
        raise ValueError(f"Unknown level name: {level}") from e


def check_level(level: Union[str, int, LogLevel]) -> int:
    if isinstance(level, int):
        if level not in LEVEL_TO_NAME:
            raise ValueError(f"Unknown level: {level}")
        return level
    elif isinstance(level, str):
        try:
            return NAME_TO_LEVEL[level]
        except KeyError:
            raise ValueError(f"Unknown level: {level}")
    else:
        raise TypeError(f"Level not an Union[str, int, LogLevel]: {level}")
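The normalization above accepts level names, plain ints, or `LogLevel` members and always yields the numeric value (a `LogLevel` member is an `IntEnum`, so it takes the `int` branch). A self-contained sketch, with the enum and mappings condensed from the listing above (the `FATAL`/`WARN` aliases are dropped for brevity) so it runs without aiologger installed:

```python
import enum
from typing import Union

class LogLevel(enum.IntEnum):  # condensed from the listing above
    CRITICAL = 50
    ERROR = 40
    WARNING = 30
    INFO = 20
    DEBUG = 10
    NOTSET = 0

NAME_TO_LEVEL = {name: LogLevel[name].value for name in LogLevel.__members__}
LEVEL_TO_NAME = {level.value: level.name for level in LogLevel}

def check_level(level: Union[str, int, LogLevel]) -> int:
    if isinstance(level, int):  # LogLevel members land here too
        if level not in LEVEL_TO_NAME:
            raise ValueError(f"Unknown level: {level}")
        return level
    elif isinstance(level, str):
        try:
            return NAME_TO_LEVEL[level]
        except KeyError:
            raise ValueError(f"Unknown level: {level}")
    raise TypeError(f"Level not an Union[str, int, LogLevel]: {level}")

print(check_level("INFO"))               # 20
print(int(check_level(LogLevel.ERROR)))  # 40
print(check_level(40))                   # 40
```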
@@ -0,0 +1,344 @@
import asyncio
import io
import sys
import traceback
from asyncio import AbstractEventLoop, Task
from typing import Iterable, Optional, Callable, Awaitable, List, NamedTuple

from aiologger.filters import StdoutFilter, Filterer
from aiologger.formatters.base import Formatter
from aiologger.handlers.base import Handler
from aiologger.handlers.streams import AsyncStreamHandler
from aiologger.levels import LogLevel, check_level
from aiologger.records import LogRecord
from aiologger.utils import (
    get_current_frame,
    create_task,
    loop_compat,
    bind_loop,
)

_HandlerFactory = Callable[[], Awaitable[Iterable[Handler]]]


class _Caller(NamedTuple):
    filename: str
    line_number: int
    function_name: str
    stack: Optional[str]


def o_o():
    """
    Ordinarily we would use __file__ for this, but frozen modules don't always
    have __file__ set, for some reason (see Issue logging#21736). Thus, we get
    the filename from a handy code object from a function defined in this
    module.
    """
    raise NotImplementedError(
        "I shouldn't be called. My only purpose is to provide "
        "the filename from a handy code object."
    )


# _srcfile is used when walking the stack to check when we've got the first
# caller stack frame, by skipping frames whose filename is that of this
# module's source. It therefore should contain the filename of this module's
# source file.
_srcfile = o_o.__code__.co_filename


@loop_compat
class Logger(Filterer):
    def __init__(self, *, name="aiologger", level=LogLevel.NOTSET) -> None:
        super(Logger, self).__init__()
        self.name = name
        self.level = check_level(level)
        self.parent = None
        self.propagate = True
        self.handlers: List[Handler] = []
        self.disabled = False
        self._was_shutdown = False

        self._dummy_task: Optional[Task] = None

    @classmethod
    def with_default_handlers(
        cls,
        *,
        name="aiologger",
        level=LogLevel.NOTSET,
        formatter: Optional[Formatter] = None,
        **kwargs,
    ):
        self = cls(name=name, level=level, **kwargs)  # type: ignore

        _AsyncStreamHandler = bind_loop(AsyncStreamHandler, kwargs)
        self.add_handler(
            _AsyncStreamHandler(
                stream=sys.stdout,
                level=LogLevel.DEBUG,
                formatter=formatter,
                filter=StdoutFilter(),
            )
        )
        self.add_handler(
            _AsyncStreamHandler(
                stream=sys.stderr, level=LogLevel.WARNING, formatter=formatter
            )
        )

        return self

    def find_caller(self, stack_info=False) -> _Caller:
        """
        Find the stack frame of the caller so that we can note the source
        file name, line number and function name.
        """
        frame = get_current_frame()
        # On some versions of IronPython, currentframe() returns None if
        # IronPython isn't run with -X:Frames.
        if frame is not None:
            frame = frame.f_back
        while hasattr(frame, "f_code"):
            code = frame.f_code
            filename = code.co_filename
            if filename == _srcfile:
                frame = frame.f_back
                continue
            sinfo = None
            if stack_info:
                sio = io.StringIO()
                sio.write("Stack (most recent call last):\n")
                traceback.print_stack(frame, file=sio)
                sinfo = sio.getvalue()
                if sinfo[-1] == "\n":
                    sinfo = sinfo[:-1]
                sio.close()
            return _Caller(
                filename=code.co_filename or "(unknown file)",
                line_number=frame.f_lineno,
                function_name=code.co_name,
                stack=sinfo,
            )
        return _Caller(
            filename="(unknown file)",
            line_number=0,
            function_name="(unknown function)",
            stack=None,
        )

    async def call_handlers(self, record):
        """
        Pass a record to all relevant handlers.

        Loop through all handlers for this logger and its parents in the
        logger hierarchy. If no handler was found, raises an error. Stop
        searching up the hierarchy whenever a logger with the "propagate"
        attribute set to zero is found - that will be the last logger
        whose handlers are called.
        """
        c = self
        found = 0
        while c:
            for handler in c.handlers:
                found = found + 1
                if record.levelno >= handler.level:
                    await handler.handle(record)
            if not c.propagate:
                c = None  # break out
            else:
                c = c.parent
        if found == 0:
            raise Exception("No handlers could be found for logger")

    def add_handler(self, handler: Handler) -> None:
        """
        Add the specified handler to this logger.
        """
        if handler not in self.handlers:
            self.handlers.append(handler)

    def remove_handler(self, handler: Handler) -> None:
        """
        Remove the specified handler from this logger.
        """
        if handler in self.handlers:
            self.handlers.remove(handler)

    async def handle(self, record):
        """
        Call the handlers for the specified record.

        This method is used for unpickled records received from a socket, as
        well as those created locally. Logger-level filtering is applied.
        """
        if (not self.disabled) and self.filter(record):
            await self.call_handlers(record)

    def _log(
        self,
        level,
        msg,
        args,
        exc_info=None,
        extra=None,
        stack_info=False,
        caller: _Caller = None,
    ) -> Task:
        sinfo = None
        if _srcfile and caller is None:  # type: ignore
            # IronPython doesn't track Python frames, so find_caller raises an
            # exception on some versions of IronPython. We trap it here so that
            # IronPython can use logging.
            try:
                fn, lno, func, sinfo = self.find_caller(stack_info)
            except ValueError:  # pragma: no cover
                fn, lno, func = "(unknown file)", 0, "(unknown function)"
        elif caller:
            fn, lno, func, sinfo = caller
        else:  # pragma: no cover
            fn, lno, func = "(unknown file)", 0, "(unknown function)"
        if exc_info and isinstance(exc_info, BaseException):
            exc_info = (type(exc_info), exc_info, exc_info.__traceback__)

        record = LogRecord(  # type: ignore
            name=self.name,
            level=level,
            pathname=fn,
            lineno=lno,
            msg=msg,
            args=args,
            exc_info=exc_info,
            func=func,
            sinfo=sinfo,
            extra=extra,
        )
        return create_task(self.handle(record))

    def __make_dummy_task(self) -> Task:
        async def _dummy(*args, **kwargs):
            return

        return create_task(_dummy())

    def is_enabled_for(self, level) -> bool:
        return level >= self.level

    def _make_log_task(self, level, msg, *args, **kwargs) -> Task:
        """
        Creates an asyncio.Task for a msg if logging is enabled for level.
        Returns a dummy task otherwise.
        """
        if not self.is_enabled_for(level):
            if self._dummy_task is None:
                self._dummy_task = self.__make_dummy_task()
            return self._dummy_task

        if kwargs.get("exc_info", False):
            if not isinstance(kwargs["exc_info"], BaseException):
                kwargs["exc_info"] = sys.exc_info()

        return self._log(  # type: ignore
            level, msg, *args, caller=self.find_caller(False), **kwargs
        )

    def debug(self, msg, *args, **kwargs) -> Task:
        """
        Log msg with severity 'DEBUG'.

        To pass exception information, use the keyword argument exc_info with
        a true value, e.g.

        await logger.debug("Houston, we have a %s", "thorny problem", exc_info=1)
        """
        return self._make_log_task(LogLevel.DEBUG, msg, args, **kwargs)

    def info(self, msg, *args, **kwargs) -> Task:
        """
        Log msg with severity 'INFO'.

        To pass exception information, use the keyword argument exc_info with
        a true value, e.g.

        await logger.info("Houston, we have an interesting problem", exc_info=1)
        """
        return self._make_log_task(LogLevel.INFO, msg, args, **kwargs)

    def warning(self, msg, *args, **kwargs) -> Task:
        """
        Log msg with severity 'WARNING'.

        To pass exception information, use the keyword argument exc_info with
        a true value, e.g.

        await logger.warning("Houston, we have a bit of a problem", exc_info=1)
        """
        return self._make_log_task(LogLevel.WARNING, msg, args, **kwargs)

    warn = warning

    def error(self, msg, *args, **kwargs) -> Task:
        """
        Log msg with severity 'ERROR'.

        To pass exception information, use the keyword argument exc_info with
        a true value, e.g.

        await logger.error("Houston, we have a major problem", exc_info=1)
        """
        return self._make_log_task(LogLevel.ERROR, msg, args, **kwargs)

    def critical(self, msg, *args, **kwargs) -> Task:
        """
        Log msg with severity 'CRITICAL'.

        To pass exception information, use the keyword argument exc_info with
        a true value, e.g.

        await logger.critical("Houston, we have a major disaster", exc_info=1)
        """
        return self._make_log_task(LogLevel.CRITICAL, msg, args, **kwargs)

    fatal = critical

    def exception(self, msg, *args, exc_info=True, **kwargs) -> Task:
        """
        Convenience method for logging an ERROR with exception information.
        """
        return self.error(msg, *args, exc_info=exc_info, **kwargs)

    async def shutdown(self):
        """
        Perform any cleanup actions in the logging system (e.g. flushing
        buffers).

        Should be called at application exit.
        """
        if self._was_shutdown:
            return
        self._was_shutdown = True
        await self._do_shutdown()

    async def _do_shutdown(self):
        """
        Does the actual shutdown.
        """
        for handler in reversed(self.handlers):
            if not handler:
                continue
            try:
                if handler.initialized:
                    await handler.flush()
                    await handler.close()
            except Exception:
                # Ignore errors which might be caused because handlers have
                # been closed but references to them are still around at
                # application exit. Basically ignore everything, as we're
                # shutting down.
                pass
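The log methods above are synchronous and return asyncio Tasks: each call schedules the record handling and hands the Task back, so callers can fire-and-forget or await delivery. A minimal sketch of that pattern (`MiniLogger` is a hypothetical stand-in for illustration, not aiologger's API):

```python
import asyncio

class MiniLogger:
    """Hypothetical stand-in illustrating the return-a-Task pattern."""

    def __init__(self):
        self.records = []

    async def handle(self, record):
        # stands in for Logger.handle -> call_handlers
        self.records.append(record)

    def info(self, msg):
        # like Logger._make_log_task: schedule handling, return the Task
        return asyncio.get_running_loop().create_task(self.handle(msg))

async def main():
    logger = MiniLogger()
    task = logger.info("hello")  # returns immediately with a Task
    await task                   # await only if delivery must be confirmed
    return logger.records

print(asyncio.run(main()))  # ['hello']
```

This is why `await logger.info(...)` works in the docstrings above even though `info` itself is not a coroutine: awaiting the returned Task just waits for the scheduled handling to finish.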
@@ -0,0 +1,112 @@
import json
from datetime import timezone
from asyncio import AbstractEventLoop, Task
from typing import Dict, Iterable, Callable, Tuple, Any, Optional, Mapping

from aiologger import Logger
from aiologger.utils import create_task, loop_compat
from aiologger.formatters.base import Formatter
from aiologger.formatters.json import ExtendedJsonFormatter
from aiologger.levels import LogLevel
from aiologger.logger import _Caller
from aiologger.records import ExtendedLogRecord


@loop_compat
class JsonLogger(Logger):
    def __init__(
        self,
        name: str = "aiologger-json",
        level: int = LogLevel.DEBUG,
        flatten: bool = False,
        serializer_kwargs: Dict = None,
        extra: Dict = None,
    ) -> None:
        super().__init__(name=name, level=level)

        self.flatten = flatten

        if serializer_kwargs is None:
            serializer_kwargs = {}
        self.serializer_kwargs = serializer_kwargs

        if extra is None:
            extra = {}
        self.extra = extra

    @classmethod
    def with_default_handlers(  # type: ignore
        cls,
        *,
        name: str = "aiologger-json",
        level: int = LogLevel.NOTSET,
        serializer: Callable[..., str] = json.dumps,
        flatten: bool = False,
        serializer_kwargs: Dict = None,
        extra: Dict = None,
        exclude_fields: Iterable[str] = None,
        tz: timezone = None,
        formatter: Optional[Formatter] = None,
        **kwargs,
    ):
        if formatter is None:
            formatter = ExtendedJsonFormatter(
                serializer=serializer, exclude_fields=exclude_fields, tz=tz
            )
        return super(JsonLogger, cls).with_default_handlers(
            name=name,
            level=level,
            flatten=flatten,
            serializer_kwargs=serializer_kwargs,
            extra=extra,
            formatter=formatter,
            **kwargs,
        )

    def _log(  # type: ignore
        self,
        level: LogLevel,
        msg: Any,
        args: Optional[Tuple[Mapping]],
        exc_info=None,
        extra: Dict = None,
        stack_info=False,
        flatten: bool = False,
        serializer_kwargs: Dict = None,
        caller: _Caller = None,
    ) -> Task:
        """
        Low-level logging routine which creates an ExtendedLogRecord and
        then calls all the handlers of this logger to handle the record.

        Overridden to properly handle the log methods' kwargs.
        """
        sinfo = None
        if caller:
            fn, lno, func, sinfo = caller
        else:  # pragma: no cover
            fn, lno, func = "(unknown file)", 0, "(unknown function)"
        if exc_info and isinstance(exc_info, BaseException):
            exc_info = (type(exc_info), exc_info, exc_info.__traceback__)

        joined_extra = {}
        joined_extra.update(self.extra)

        if extra:
            joined_extra.update(extra)

        record = ExtendedLogRecord(
            name=self.name,
            level=level,
            pathname=fn,
            lineno=lno,
            msg=msg,
            args=args,
            exc_info=exc_info,
            func=func,
            sinfo=sinfo,
            extra=joined_extra,
            flatten=flatten or self.flatten,
            serializer_kwargs=serializer_kwargs or self.serializer_kwargs,
        )
        return create_task(self.handle(record))
@@ -0,0 +1,6 @@
import asyncio


class AiologgerProtocol(asyncio.Protocol):
    # StreamWriter.drain() delegates to the protocol's _drain_helper, which
    # normally waits until the transport resumes writing; a no-op override
    # keeps drain() from ever blocking.
    async def _drain_helper(self):
        pass
@@ -0,0 +1,136 @@
# The following code and documentation was inspired, and in some cases
# copied and modified, from the work of Vinay Sajip and contributors
# on cpython's logging package
import os
import time
import types
from collections.abc import Mapping
from typing import Optional, Tuple, Type

from aiologger.levels import LogLevel, get_level_name

ExceptionInfo = Tuple[Type[BaseException], BaseException, types.TracebackType]


class LogRecord:
    """
    A LogRecord instance represents an event being logged.

    LogRecord instances are created every time something is logged. They
    contain all the information pertinent to the event being logged. The
    main information passed in is in msg and args, which are combined
    using str(msg) % args to create the message field of the record. The
    record also includes information such as when the record was created,
    the source line where the logging call was made, and any exception
    information to be logged.
    """

    def __init__(
        self,
        name: str,
        level: LogLevel,
        pathname: str,
        lineno: int,
        msg,
        args: Optional[Tuple[Mapping]] = None,
        exc_info: Optional[ExceptionInfo] = None,
        func: Optional[str] = None,
        sinfo: Optional[str] = None,
        **kwargs,
    ) -> None:
        """
        :param name: The name of the logger used to log the event represented
            by this LogRecord. Note that this name will always have this
            value, even though it may be emitted by a handler attached to a
            different (ancestor) logger.
        :param level: The numeric level of the logging event (one of DEBUG,
            INFO etc.) Note that this is converted to two attributes of the
            LogRecord: levelno for the numeric value and levelname for the
            corresponding level name.
        :param pathname: The full pathname of the source file where the
            logging call was made.
        :param lineno: The line number in the source file where the logging
            call was made.
        :param msg: The event description message, possibly a format string
            with placeholders for variable data.
        :param args: Variable data to merge into the msg argument to obtain
            the event description.
        :param exc_info: An exception tuple with the current exception
            information, or None if no exception information is available.
        :param func: The name of the function or method from which the
            logging call was invoked.
        :param sinfo: A text string representing stack information from the
            base of the stack in the current thread, up to the logging call.
        """
        created_at = time.time()
        self.name = name
        self.msg = msg
        self.args: Optional[Mapping]
        if args:
            if len(args) != 1 or not isinstance(args[0], Mapping):
                raise ValueError(
                    f"Invalid LogRecord args type: {type(args[0])}. "
                    f"Expected Mapping"
                )
            self.args = args[0]
        else:
            self.args = args
        self.levelname = get_level_name(level)
        self.levelno = level
        self.pathname = pathname
        try:
            self.filename = os.path.basename(pathname)
            self.module = os.path.splitext(self.filename)[0]
        except (TypeError, ValueError, AttributeError):
            self.filename = pathname
            self.module = "Unknown module"
        self.exc_info = exc_info
        self.exc_text: Optional[str] = None  # used to cache the traceback text
        self.stack_info = sinfo
        self.lineno = lineno
        self.funcName = func
        self.created = created_at
        self.msecs = (created_at - int(created_at)) * 1000
        self.process = os.getpid()
        self.asctime: Optional[str] = None
        self.message: Optional[str] = None

    def __str__(self):
        return (
            f"<{self.__class__.__name__}: {self.name}, {self.levelname}, "
            f'{self.pathname}, {self.lineno}, "{self.msg}">'
        )

    __repr__ = __str__

    def get_message(self):
        """
        Return the message for this LogRecord after merging any user-supplied
        arguments with the message.
        """
        msg = str(self.msg)
        if self.args:
            msg = msg % self.args
        return msg


class ExtendedLogRecord(LogRecord):
|
||||
def __init__(
|
||||
self,
|
||||
name: str,
|
||||
level: LogLevel,
|
||||
pathname: str,
|
||||
lineno: int,
|
||||
msg,
|
||||
args: Optional[Tuple[Mapping]],
|
||||
exc_info: Optional[ExceptionInfo],
|
||||
func: Optional[str] = None,
|
||||
sinfo: Optional[str] = None,
|
||||
**kwargs,
|
||||
) -> None:
|
||||
super().__init__(
|
||||
name, level, pathname, lineno, msg, args, exc_info, func, sinfo
|
||||
)
|
||||
self.extra = kwargs["extra"]
|
||||
self.flatten = kwargs["flatten"]
|
||||
self.serializer_kwargs = kwargs["serializer_kwargs"]
|
||||
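The record stores a single `Mapping` in `self.args`, and `get_message` merges it into the message with %-style formatting. A minimal standalone sketch of that merge (the message text and values here are hypothetical, not aiologger's):

```python
# Standalone sketch of the %-style merge done by LogRecord.get_message:
# when args is a Mapping, named placeholders are filled from it.
msg = "user %(user)s hit %(path)s"          # hypothetical message
args = {"user": "ana", "path": "/health"}   # hypothetical Mapping args

# get_message effectively does: str(self.msg) % self.args when args is truthy
message = str(msg) % args
print(message)  # user ana hit /health
```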
@@ -0,0 +1,16 @@
from os import getenv
from typing import Optional


def get_bool_env(name: str, default: Optional[bool] = None) -> bool:
    value = getenv(name, default)
    if not value:
        return False
    if value in ("False", "false", "0"):
        return False
    return True


HANDLE_ERROR_FALLBACK_ENABLED = get_bool_env(
    "AIOLOGGER_HANDLE_ERROR_FALLBACK_ENABLED", default=True
)
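The parsing rule implemented by `get_bool_env` can be exercised with a standalone copy of its logic (the variable names below are hypothetical):

```python
import os


def get_bool_env_sketch(name, default=None):
    # Same logic as get_bool_env above: missing/falsy -> False,
    # the literal strings "False"/"false"/"0" -> False, else True.
    value = os.getenv(name, default)
    if not value:
        return False
    if value in ("False", "false", "0"):
        return False
    return True


os.environ["AIOLOGGER_DEMO_FLAG"] = "0"  # hypothetical variable
print(get_bool_env_sketch("AIOLOGGER_DEMO_FLAG"))                 # False: "0" is a false literal
print(get_bool_env_sketch("AIOLOGGER_DEMO_UNSET", default=True))  # True: default kept
```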
123
python3-vckonline/lib/python3.8/site-packages/aiologger/utils.py
Normal file
@@ -0,0 +1,123 @@
import sys
import warnings
import functools
from asyncio import AbstractEventLoop
from typing import Callable, TypeVar, Type, cast


if sys.version_info >= (3, 7):
    from asyncio import get_running_loop
    from asyncio import create_task
else:
    from asyncio import _get_running_loop

    def get_running_loop():
        loop = _get_running_loop()
        if loop is None:
            raise RuntimeError("no running event loop")
        return loop

    def create_task(coro):
        loop = get_running_loop()
        return loop.create_task(coro)
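The pre-3.7 branch wraps `asyncio._get_running_loop`, which returns `None` when called outside a running loop; the shim turns that into the same `RuntimeError` that the 3.7+ `get_running_loop` raises. A self-contained sketch of that fallback behavior:

```python
import asyncio


def get_running_loop_sketch():
    # Mirrors the fallback above: None outside a loop -> RuntimeError
    loop = asyncio._get_running_loop()
    if loop is None:
        raise RuntimeError("no running event loop")
    return loop


async def main():
    # Inside a coroutine, the sketch returns the currently running loop
    return get_running_loop_sketch() is asyncio.get_running_loop()


print(asyncio.run(main()))  # True

try:
    get_running_loop_sketch()  # no loop running at top level
except RuntimeError as exc:
    print(exc)  # no running event loop
```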

_T = TypeVar("_T", bound=Type[object])


class _LoopCompat:
    __loop = None

    @property
    def _loop(self) -> AbstractEventLoop:
        warnings.warn(
            "The .loop and ._loop attributes are deprecated", DeprecationWarning
        )
        loop = self.__loop
        return get_running_loop() if loop is None else loop

    @property
    def loop(self) -> AbstractEventLoop:
        warnings.warn(
            "The .loop and ._loop attributes are deprecated", DeprecationWarning
        )
        return self._loop

    @classmethod
    def decorate(cls, v: _T) -> _T:
        @functools.wraps(v.__init__)
        def __init__(self, *args, **kwargs):
            try:
                self.__loop = kwargs.pop("loop")
            except KeyError:
                pass
            else:
                warnings.warn(
                    "The loop argument is deprecated", DeprecationWarning
                )
            __init__.__wrapped__(self, *args, **kwargs)

        v.__init__ = __init__  # type: ignore
        v.__loop = None  # type: ignore
        _loop = cls._loop
        loop = cls.loop
        if not hasattr(v, "_loop"):
            v._loop = _loop  # type: ignore

        if not hasattr(v, "loop"):
            v.loop = loop  # type: ignore

        return v

_F = TypeVar("_F", bound=Callable[..., object])

if sys.version_info >= (3, 10):

    def loop_compat(v: _T) -> _T:
        return v

    def bind_loop(v: _F, kwargs: dict) -> _F:
        return v

else:
    loop_compat = _LoopCompat.decorate

    def bind_loop(v: _F, kwargs: dict) -> _F:
        """
        Bind a loop kwarg, without letting mypy know about it.
        """
        try:
            return cast(_F, functools.partial(v, loop=kwargs["loop"]))
        except KeyError:
            pass
        return v
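`bind_loop` forwards an optional `loop` kwarg through `functools.partial` only when the caller actually supplied one. A standalone sketch of that pattern (the function names below are hypothetical):

```python
import functools


def bind_loop_sketch(func, kwargs):
    # Same shape as bind_loop above: partially apply loop= only if present
    try:
        return functools.partial(func, loop=kwargs["loop"])
    except KeyError:
        return func


def runner(loop=None):  # hypothetical callable taking an optional loop
    return "explicit" if loop is not None else "implicit"


bound = bind_loop_sketch(runner, {"loop": object()})
unbound = bind_loop_sketch(runner, {})
print(bound())    # explicit
print(unbound())  # implicit
```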

class classproperty:
    def __init__(self, func):
        self._func = func

    def __get__(self, obj, owner):
        return self._func(owner)


class CallableWrapper:
    def __init__(self, func: Callable) -> None:
        self.func = func

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)


if hasattr(sys, "_getframe"):
    get_current_frame = lambda: sys._getframe(3)
else:  # pragma: no cover

    def get_current_frame():
        """Return the frame object for the caller's stack frame."""
        try:
            raise Exception
        except Exception:
            return sys.exc_info()[2].tb_frame.f_back
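`classproperty` is a minimal descriptor: `__get__` ignores the instance and calls the wrapped function with the owner class. A standalone usage sketch (the `Config` class is hypothetical):

```python
class classproperty:
    # Same descriptor as above, repeated so this sketch runs on its own
    def __init__(self, func):
        self._func = func

    def __get__(self, obj, owner):
        return self._func(owner)


class Config:  # hypothetical consumer
    _name = "default"

    @classproperty
    def name(cls):
        return cls._name


print(Config.name)    # accessed on the class itself
print(Config().name)  # on an instance, the owner class is still passed
```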
@@ -0,0 +1 @@
pip
@@ -0,0 +1,28 @@
Copyright 2014 Pallets

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

1.  Redistributions of source code must retain the above copyright
    notice, this list of conditions and the following disclaimer.

2.  Redistributions in binary form must reproduce the above copyright
    notice, this list of conditions and the following disclaimer in the
    documentation and/or other materials provided with the distribution.

3.  Neither the name of the copyright holder nor the names of its
    contributors may be used to endorse or promote products derived from
    this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.