Anosql is abandoned upstream. Vendor it.
parent f6815f2608
commit a5aea04c48
38 changed files with 2928 additions and 2 deletions
7  projects/anosql/BUILD  Normal file
@@ -0,0 +1,7 @@
py_project(
|
||||
name="anosql",
|
||||
test_deps = [
|
||||
py_requirement("pytest-postgresql"),
|
||||
py_requirement("psycopg2"),
|
||||
]
|
||||
)
|
26  projects/anosql/LICENSE  Normal file
@@ -0,0 +1,26 @@
Copyright (c) 2014-2017, Honza Pokorny
|
||||
All rights reserved.
|
||||
|
||||
Redistribution and use in source and binary forms, with or without
|
||||
modification, are permitted provided that the following conditions are met:
|
||||
|
||||
1. Redistributions of source code must retain the above copyright notice, this
|
||||
list of conditions and the following disclaimer.
|
||||
2. Redistributions in binary form must reproduce the above copyright notice,
|
||||
this list of conditions and the following disclaimer in the documentation
|
||||
and/or other materials provided with the distribution.
|
||||
|
||||
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
|
||||
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
|
||||
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
|
||||
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
|
||||
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
|
||||
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
|
||||
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
|
||||
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
|
||||
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
|
||||
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||
|
||||
The views and conclusions contained in the software and documentation are those
|
||||
of the authors and should not be interpreted as representing official policies,
|
||||
either expressed or implied, of the FreeBSD Project.
|
252  projects/anosql/README.rst  Normal file
@@ -0,0 +1,252 @@
anosql
|
||||
======
|
||||
|
||||
**NOTICE**: This project is now deprecated in favor of `aiosql`_.
|
||||
|
||||
Unfortunately, I no longer have the time to devote to this project, and aiosql
|
||||
is now a lot more popular. I don't think it makes sense to maintain both.
|
||||
Open source ftw! Thanks for your hard work, `Will`_!
|
||||
|
||||
.. _aiosql: https://github.com/nackjicholson/aiosql
|
||||
.. _Will: https://github.com/nackjicholson
|
||||
|
||||
.. image:: https://badge.fury.io/py/anosql.svg
|
||||
:target: https://badge.fury.io/py/anosql
|
||||
:alt: pypi package version
|
||||
|
||||
.. image:: http://readthedocs.org/projects/anosql/badge/?version=latest
|
||||
:target: http://anosql.readthedocs.io/en/latest/?badge=latest
|
||||
:alt: Documentation Status
|
||||
|
||||
.. image:: https://travis-ci.org/honza/anosql.svg?branch=master
|
||||
:target: https://travis-ci.org/honza/anosql
|
||||
:alt: Travis build status
|
||||
|
||||
A Python library for using SQL
|
||||
|
||||
Inspired by the excellent `Yesql`_ library by Kris Jenkins. In my mother
|
||||
tongue, *ano* means *yes*.
|
||||
|
||||
If you are on python3.6+ or need ``anosql`` to work with ``asyncio``-based database drivers, see the related project, `aiosql <https://github.com/nackjicholson/aiosql>`_.
|
||||
|
||||
Complete documentation is available at `Read The Docs <https://anosql.readthedocs.io/en/latest/>`_.
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
::
|
||||
|
||||
$ pip install anosql
|
||||
|
||||
Usage
|
||||
-----
|
||||
|
||||
Basics
|
||||
******
|
||||
|
||||
Given a ``queries.sql`` file:
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-all-greetings
|
||||
-- Get all the greetings in the database
|
||||
SELECT * FROM greetings;
|
||||
|
||||
-- name: select-users
|
||||
-- Get all the users from the database,
|
||||
-- and return it as a dict
|
||||
SELECT * FROM USERS;
|
||||
|
||||
We can issue SQL queries, like so:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
import anosql
|
||||
import psycopg2
|
||||
import sqlite3
|
||||
|
||||
# PostgreSQL
|
||||
conn = psycopg2.connect('...')
|
||||
queries = anosql.from_path('queries.sql', 'psycopg2')
|
||||
|
||||
# Or, Sqlite3...
|
||||
conn = sqlite3.connect('cool.db')
|
||||
queries = anosql.from_path('queries.sql', 'sqlite3')
|
||||
|
||||
queries.get_all_greetings(conn)
|
||||
# => [(1, 'en', 'Hi')]
|
||||
|
||||
queries.get_all_greetings.__doc__
|
||||
# => Get all the greetings in the database
|
||||
|
||||
queries.get_all_greetings.sql
|
||||
# => SELECT * FROM greetings;
|
||||
|
||||
queries.available_queries
|
||||
# => ['get_all_greetings']
|
||||
|
||||
|
||||
Parameters
|
||||
**********
|
||||
|
||||
Often, you want to change parts of the query dynamically, particularly values in
|
||||
the ``WHERE`` clause. You can use parameters to do this:
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-greetings-for-language
|
||||
-- Get all the greetings in the database for a given language
|
||||
SELECT *
|
||||
FROM greetings
|
||||
WHERE lang = %s;
|
||||
|
||||
And they become positional parameters:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
visitor_language = "en"
|
||||
queries.get_greetings_for_language(conn, visitor_language)
|
||||
# => [(1, 'en', 'Hi')]
|
||||
|
||||
|
||||
One Row Query
|
||||
*************
|
||||
|
||||
Often you expect at most one row from a query, so getting a list back is not convenient.
Appending ``?`` to the query name makes it return a single tuple if the query returned one
row, or ``None`` otherwise.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-a-greeting?
|
||||
-- Get a greeting based on its id
|
||||
SELECT *
|
||||
FROM greetings
|
||||
WHERE id = %s;
|
||||
|
||||
Then a tuple is returned:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
queries.get_a_greeting(conn, 1)
|
||||
# => (1, 'en', 'Hi')
|
||||
|
||||
|
||||
Named Parameters
|
||||
****************
|
||||
|
||||
To make queries with many parameters more understandable and maintainable, you
|
||||
can give the parameters names:
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-greetings-for-language-and-length
|
||||
-- Get all the greetings in the database for a given language and length
|
||||
SELECT *
|
||||
FROM greetings
|
||||
WHERE lang = :lang
|
||||
AND len(greeting) <= :length_limit;
|
||||
|
||||
If you were writing a PostgreSQL query, you could also format the parameters as
|
||||
``%(lang)s`` and ``%(length_limit)s``.
|
||||
|
||||
Then, call your queries like you would any Python function with named
|
||||
parameters:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
visitor_language = "en"
|
||||
|
||||
greetings_for_texting = queries.get_greetings_for_language_and_length(
|
||||
conn, lang=visitor_language, length_limit=140)
|
||||
|
||||
Update/Insert/Delete
|
||||
********************
|
||||
|
||||
In order to run ``UPDATE``, ``INSERT``, or ``DELETE`` statements, you need to
|
||||
add ``!`` to the end of your query name. Anosql will then execute it properly.
|
||||
It will also return the number of affected rows.
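
For example, a minimal sketch (the ``delete-greeting!`` query shown here is illustrative,
not part of the library):

.. code-block:: sql

    -- name: delete-greeting!
    -- Remove a greeting by its id
    DELETE FROM greetings WHERE id = %s;

.. code-block:: python

    deleted = queries.delete_greeting(conn, 1)
    # => 1 (number of affected rows)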
|
||||
|
||||
Insert queries returning autogenerated values
|
||||
*********************************************
|
||||
|
||||
If you want the auto-generated primary key to be returned after you run an
|
||||
insert query, you can add ``<!`` to the end of your query name.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: create-user<!
|
||||
INSERT INTO person (name) VALUES (:name)
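
Calling it then returns the auto-generated key (a sketch; the exact value depends on the
driver, e.g. ``cur.lastrowid`` under ``sqlite3``):

.. code-block:: python

    user_id = queries.create_user(conn, name="Honza")
    # => primary key of the newly inserted person row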
|
||||
|
||||
Adding custom query loaders.
|
||||
****************************
|
||||
|
||||
Out of the box, ``anosql`` supports SQLite and PostgreSQL via the stdlib ``sqlite3`` database driver
|
||||
and ``psycopg2``. If you would like to extend ``anosql`` to communicate with other types of databases,
|
||||
you may create a driver adapter class and register it with ``anosql.core.register_driver_adapter()``.
|
||||
|
||||
Driver adapters are duck-typed classes which adhere to the interface below. The ``anosql/adapters``
package is a good place to get started: look at how the ``psycopg2`` and ``sqlite3`` adapters work.
|
||||
|
||||
To register a new loader::
|
||||
|
||||
import anosql
|
||||
import anosql.core
|
||||
|
||||
class MyDbAdapter():
|
||||
def process_sql(self, name, op_type, sql):
|
||||
pass
|
||||
|
||||
def select(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
@contextmanager
|
||||
def select_cursor(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_update_delete(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_update_delete_many(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_returning(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def execute_script(self, conn, sql):
|
||||
pass
|
||||
|
||||
|
||||
anosql.core.register_driver_adapter("mydb", MyDbAdapter)
|
||||
|
||||
# To use it, make a connection to your db and pass "mydb" as the db_type:
|
||||
import mydbdriver
|
||||
conn = mydbdriver.connect("...")
|
||||
|
||||
queries = anosql.from_path("path/to/sql/", "mydb")
|
||||
greetings = queries.get_greetings(conn)
|
||||
|
||||
conn.close()
|
||||
|
||||
If your adapter constructor takes arguments, you can register a function which can build
|
||||
your adapter instance::
|
||||
|
||||
def adapter_factory():
|
||||
return MyDbAdapter("foo", 42)
|
||||
|
||||
anosql.register_driver_adapter("mydb", adapter_factory)
|
||||
|
||||
Tests
|
||||
-----
|
||||
|
||||
::
|
||||
|
||||
$ pip install tox
|
||||
$ tox
|
||||
|
||||
License
|
||||
-------
|
||||
|
||||
BSD, short and sweet
|
||||
|
||||
.. _Yesql: https://github.com/krisajenkins/yesql/
|
225  projects/anosql/doc/Makefile  Normal file
@@ -0,0 +1,225 @@
# Makefile for Sphinx documentation
|
||||
#
|
||||
|
||||
# You can set these variables from the command line.
|
||||
SPHINXOPTS =
|
||||
SPHINXBUILD = sphinx-build
|
||||
PAPER =
|
||||
BUILDDIR = _build
|
||||
|
||||
# Internal variables.
|
||||
PAPEROPT_a4 = -D latex_paper_size=a4
|
||||
PAPEROPT_letter = -D latex_paper_size=letter
|
||||
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
|
||||
# the i18n builder cannot share the environment and doctrees with the others
|
||||
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
|
||||
|
||||
.PHONY: help
|
||||
help:
|
||||
@echo "Please use \`make <target>' where <target> is one of"
|
||||
@echo " html to make standalone HTML files"
|
||||
@echo " dirhtml to make HTML files named index.html in directories"
|
||||
@echo " singlehtml to make a single large HTML file"
|
||||
@echo " pickle to make pickle files"
|
||||
@echo " json to make JSON files"
|
||||
@echo " htmlhelp to make HTML files and a HTML help project"
|
||||
@echo " qthelp to make HTML files and a qthelp project"
|
||||
@echo " applehelp to make an Apple Help Book"
|
||||
@echo " devhelp to make HTML files and a Devhelp project"
|
||||
@echo " epub to make an epub"
|
||||
@echo " epub3 to make an epub3"
|
||||
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
|
||||
@echo " latexpdf to make LaTeX files and run them through pdflatex"
|
||||
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
|
||||
@echo " text to make text files"
|
||||
@echo " man to make manual pages"
|
||||
@echo " texinfo to make Texinfo files"
|
||||
@echo " info to make Texinfo files and run them through makeinfo"
|
||||
@echo " gettext to make PO message catalogs"
|
||||
@echo " changes to make an overview of all changed/added/deprecated items"
|
||||
@echo " xml to make Docutils-native XML files"
|
||||
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
|
||||
@echo " linkcheck to check all external links for integrity"
|
||||
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
|
||||
@echo " coverage to run coverage check of the documentation (if enabled)"
|
||||
@echo " dummy to check syntax errors of document sources"
|
||||
|
||||
.PHONY: clean
|
||||
clean:
|
||||
rm -rf $(BUILDDIR)/*
|
||||
|
||||
.PHONY: html
|
||||
html:
|
||||
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
|
||||
|
||||
.PHONY: dirhtml
|
||||
dirhtml:
|
||||
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
|
||||
@echo
|
||||
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
|
||||
|
||||
.PHONY: singlehtml
|
||||
singlehtml:
|
||||
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
|
||||
@echo
|
||||
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
|
||||
|
||||
.PHONY: pickle
|
||||
pickle:
|
||||
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
|
||||
@echo
|
||||
@echo "Build finished; now you can process the pickle files."
|
||||
|
||||
.PHONY: json
|
||||
json:
|
||||
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
|
||||
@echo
|
||||
@echo "Build finished; now you can process the JSON files."
|
||||
|
||||
.PHONY: htmlhelp
|
||||
htmlhelp:
|
||||
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
|
||||
@echo
|
||||
@echo "Build finished; now you can run HTML Help Workshop with the" \
|
||||
".hhp project file in $(BUILDDIR)/htmlhelp."
|
||||
|
||||
.PHONY: qthelp
|
||||
qthelp:
|
||||
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
|
||||
@echo
|
||||
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
|
||||
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
|
||||
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/anosql.qhcp"
|
||||
@echo "To view the help file:"
|
||||
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/anosql.qhc"
|
||||
|
||||
.PHONY: applehelp
|
||||
applehelp:
|
||||
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
|
||||
@echo
|
||||
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
|
||||
@echo "N.B. You won't be able to view it unless you put it in" \
|
||||
"~/Library/Documentation/Help or install it in your application" \
|
||||
"bundle."
|
||||
|
||||
.PHONY: devhelp
|
||||
devhelp:
|
||||
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
|
||||
@echo
|
||||
@echo "Build finished."
|
||||
@echo "To view the help file:"
|
||||
@echo "# mkdir -p $$HOME/.local/share/devhelp/anosql"
|
||||
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/anosql"
|
||||
@echo "# devhelp"
|
||||
|
||||
.PHONY: epub
|
||||
epub:
|
||||
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
|
||||
@echo
|
||||
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
|
||||
|
||||
.PHONY: epub3
|
||||
epub3:
|
||||
$(SPHINXBUILD) -b epub3 $(ALLSPHINXOPTS) $(BUILDDIR)/epub3
|
||||
@echo
|
||||
@echo "Build finished. The epub3 file is in $(BUILDDIR)/epub3."
|
||||
|
||||
.PHONY: latex
|
||||
latex:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
|
||||
@echo
|
||||
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
|
||||
@echo "Run \`make' in that directory to run these through (pdf)latex" \
|
||||
"(use \`make latexpdf' here to do that automatically)."
|
||||
|
||||
.PHONY: latexpdf
|
||||
latexpdf:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
|
||||
@echo "Running LaTeX files through pdflatex..."
|
||||
$(MAKE) -C $(BUILDDIR)/latex all-pdf
|
||||
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
|
||||
|
||||
.PHONY: latexpdfja
|
||||
latexpdfja:
|
||||
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
|
||||
@echo "Running LaTeX files through platex and dvipdfmx..."
|
||||
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
|
||||
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
|
||||
|
||||
.PHONY: text
|
||||
text:
|
||||
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
|
||||
@echo
|
||||
@echo "Build finished. The text files are in $(BUILDDIR)/text."
|
||||
|
||||
.PHONY: man
|
||||
man:
|
||||
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
|
||||
@echo
|
||||
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
|
||||
|
||||
.PHONY: texinfo
|
||||
texinfo:
|
||||
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
|
||||
@echo
|
||||
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
|
||||
@echo "Run \`make' in that directory to run these through makeinfo" \
|
||||
"(use \`make info' here to do that automatically)."
|
||||
|
||||
.PHONY: info
|
||||
info:
|
||||
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
|
||||
@echo "Running Texinfo files through makeinfo..."
|
||||
make -C $(BUILDDIR)/texinfo info
|
||||
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
|
||||
|
||||
.PHONY: gettext
|
||||
gettext:
|
||||
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
|
||||
@echo
|
||||
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
|
||||
|
||||
.PHONY: changes
|
||||
changes:
|
||||
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
|
||||
@echo
|
||||
@echo "The overview file is in $(BUILDDIR)/changes."
|
||||
|
||||
.PHONY: linkcheck
|
||||
linkcheck:
|
||||
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
|
||||
@echo
|
||||
@echo "Link check complete; look for any errors in the above output " \
|
||||
"or in $(BUILDDIR)/linkcheck/output.txt."
|
||||
|
||||
.PHONY: doctest
|
||||
doctest:
|
||||
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
|
||||
@echo "Testing of doctests in the sources finished, look at the " \
|
||||
"results in $(BUILDDIR)/doctest/output.txt."
|
||||
|
||||
.PHONY: coverage
|
||||
coverage:
|
||||
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
|
||||
@echo "Testing of coverage in the sources finished, look at the " \
|
||||
"results in $(BUILDDIR)/coverage/python.txt."
|
||||
|
||||
.PHONY: xml
|
||||
xml:
|
||||
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
|
||||
@echo
|
||||
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
|
||||
|
||||
.PHONY: pseudoxml
|
||||
pseudoxml:
|
||||
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
|
||||
@echo
|
||||
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
|
||||
|
||||
.PHONY: dummy
|
||||
dummy:
|
||||
$(SPHINXBUILD) -b dummy $(ALLSPHINXOPTS) $(BUILDDIR)/dummy
|
||||
@echo
|
||||
@echo "Build finished. Dummy builder generates no files."
|
339  projects/anosql/doc/conf.py  Normal file
@@ -0,0 +1,339 @@
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# anosql documentation build configuration file, created by
|
||||
# sphinx-quickstart on Mon Jul 25 09:16:20 2016.
|
||||
#
|
||||
# This file is execfile()d with the current directory set to its
|
||||
# containing dir.
|
||||
#
|
||||
# Note that not all possible configuration values are present in this
|
||||
# autogenerated file.
|
||||
#
|
||||
# All configuration values have a default; values that are commented out
|
||||
# serve to show the default.
|
||||
|
||||
# If extensions (or modules to document with autodoc) are in another directory,
|
||||
# add these directories to sys.path here. If the directory is relative to the
|
||||
# documentation root, use os.path.abspath to make it absolute, like shown here.
|
||||
#
|
||||
# import os
|
||||
# import sys
|
||||
# sys.path.insert(0, os.path.abspath('.'))
|
||||
import pkg_resources
|
||||
|
||||
# -- General configuration ------------------------------------------------
|
||||
|
||||
# If your documentation needs a minimal Sphinx version, state it here.
|
||||
#
|
||||
# needs_sphinx = '1.0'
|
||||
|
||||
# Add any Sphinx extension module names here, as strings. They can be
|
||||
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
|
||||
# ones.
|
||||
extensions = ["sphinx.ext.autodoc", "sphinx.ext.napoleon"]
|
||||
|
||||
# Add any paths that contain templates here, relative to this directory.
|
||||
templates_path = ['_templates']
|
||||
|
||||
# The suffix(es) of source filenames.
|
||||
# You can specify multiple suffix as a list of string:
|
||||
#
|
||||
# source_suffix = ['.rst', '.md']
|
||||
source_suffix = '.rst'
|
||||
|
||||
# The encoding of source files.
|
||||
#
|
||||
# source_encoding = 'utf-8-sig'
|
||||
|
||||
# The master toctree document.
|
||||
master_doc = 'index'
|
||||
|
||||
# General information about the project.
|
||||
project = u'anosql'
|
||||
copyright = u'2014-2017, Honza Pokorny'
|
||||
author = u'Honza Pokorny'
|
||||
|
||||
# The version info for the project you're documenting, acts as replacement for
|
||||
# |version| and |release|, also used in various other places throughout the
|
||||
# built documents.
|
||||
#
|
||||
# The short X.Y version.
|
||||
version = pkg_resources.get_distribution('anosql').version
|
||||
# The full version, including alpha/beta/rc tags.
|
||||
release = version
|
||||
|
||||
# The language for content autogenerated by Sphinx. Refer to documentation
|
||||
# for a list of supported languages.
|
||||
#
|
||||
# This is also used if you do content translation via gettext catalogs.
|
||||
# Usually you set "language" from the command line for these cases.
|
||||
language = None
|
||||
|
||||
# There are two options for replacing |today|: either, you set today to some
|
||||
# non-false value, then it is used:
|
||||
#
|
||||
# today = ''
|
||||
#
|
||||
# Else, today_fmt is used as the format for a strftime call.
|
||||
#
|
||||
# today_fmt = '%B %d, %Y'
|
||||
|
||||
# List of patterns, relative to source directory, that match files and
|
||||
# directories to ignore when looking for source files.
|
||||
# This patterns also effect to html_static_path and html_extra_path
|
||||
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
|
||||
|
||||
# The reST default role (used for this markup: `text`) to use for all
|
||||
# documents.
|
||||
#
|
||||
# default_role = None
|
||||
|
||||
# If true, '()' will be appended to :func: etc. cross-reference text.
|
||||
#
|
||||
# add_function_parentheses = True
|
||||
|
||||
# If true, the current module name will be prepended to all description
|
||||
# unit titles (such as .. function::).
|
||||
#
|
||||
# add_module_names = True
|
||||
|
||||
# If true, sectionauthor and moduleauthor directives will be shown in the
|
||||
# output. They are ignored by default.
|
||||
#
|
||||
# show_authors = False
|
||||
|
||||
# The name of the Pygments (syntax highlighting) style to use.
|
||||
pygments_style = 'sphinx'
|
||||
|
||||
# A list of ignored prefixes for module index sorting.
|
||||
# modindex_common_prefix = []
|
||||
|
||||
# If true, keep warnings as "system message" paragraphs in the built documents.
|
||||
# keep_warnings = False
|
||||
|
||||
# If true, `todo` and `todoList` produce output, else they produce nothing.
|
||||
todo_include_todos = False
|
||||
|
||||
|
||||
# -- Options for HTML output ----------------------------------------------
|
||||
|
||||
# The theme to use for HTML and HTML Help pages. See the documentation for
|
||||
# a list of builtin themes.
|
||||
#
|
||||
html_theme = 'alabaster'
|
||||
|
||||
# Theme options are theme-specific and customize the look and feel of a theme
|
||||
# further. For a list of options available for each theme, see the
|
||||
# documentation.
|
||||
#
|
||||
# html_theme_options = {}
|
||||
|
||||
# Add any paths that contain custom themes here, relative to this directory.
|
||||
# html_theme_path = []
|
||||
|
||||
# The name for this set of Sphinx documents.
|
||||
# "<project> v<release> documentation" by default.
|
||||
#
|
||||
# html_title = u'anosql v0.1.2'
|
||||
|
||||
# A shorter title for the navigation bar. Default is the same as html_title.
|
||||
#
|
||||
# html_short_title = None
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top
|
||||
# of the sidebar.
|
||||
#
|
||||
# html_logo = None
|
||||
|
||||
# The name of an image file (relative to this directory) to use as a favicon of
|
||||
# the docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
|
||||
# pixels large.
|
||||
#
|
||||
# html_favicon = None
|
||||
|
||||
# Add any paths that contain custom static files (such as style sheets) here,
|
||||
# relative to this directory. They are copied after the builtin static files,
|
||||
# so a file named "default.css" will overwrite the builtin "default.css".
|
||||
html_static_path = []
|
||||
|
||||
# Add any extra paths that contain custom files (such as robots.txt or
|
||||
# .htaccess) here, relative to this directory. These files are copied
|
||||
# directly to the root of the documentation.
|
||||
#
|
||||
# html_extra_path = []
|
||||
|
||||
# If not None, a 'Last updated on:' timestamp is inserted at every page
|
||||
# bottom, using the given strftime format.
|
||||
# The empty string is equivalent to '%b %d, %Y'.
|
||||
#
|
||||
# html_last_updated_fmt = None
|
||||
|
||||
# If true, SmartyPants will be used to convert quotes and dashes to
|
||||
# typographically correct entities.
|
||||
#
|
||||
# html_use_smartypants = True
|
||||
|
||||
# Custom sidebar templates, maps document names to template names.
|
||||
#
|
||||
# html_sidebars = {}
|
||||
|
||||
# Additional templates that should be rendered to pages, maps page names to
|
||||
# template names.
|
||||
#
|
||||
# html_additional_pages = {}
|
||||
|
||||
# If false, no module index is generated.
|
||||
#
|
||||
# html_domain_indices = True
|
||||
|
||||
# If false, no index is generated.
|
||||
#
|
||||
# html_use_index = True
|
||||
|
||||
# If true, the index is split into individual pages for each letter.
|
||||
#
|
||||
# html_split_index = False
|
||||
|
||||
# If true, links to the reST sources are added to the pages.
|
||||
#
|
||||
# html_show_sourcelink = True
|
||||
|
||||
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
|
||||
#
|
||||
# html_show_sphinx = True
|
||||
|
||||
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
|
||||
#
|
||||
# html_show_copyright = True
|
||||
|
||||
# If true, an OpenSearch description file will be output, and all pages will
|
||||
# contain a <link> tag referring to it. The value of this option must be the
|
||||
# base URL from which the finished HTML is served.
|
||||
#
|
||||
# html_use_opensearch = ''
|
||||
|
||||
# This is the file name suffix for HTML files (e.g. ".xhtml").
|
||||
# html_file_suffix = None
|
||||
|
||||
# Language to be used for generating the HTML full-text search index.
|
||||
# Sphinx supports the following languages:
|
||||
# 'da', 'de', 'en', 'es', 'fi', 'fr', 'hu', 'it', 'ja'
|
||||
# 'nl', 'no', 'pt', 'ro', 'ru', 'sv', 'tr', 'zh'
|
||||
#
|
||||
# html_search_language = 'en'
|
||||
|
||||
# A dictionary with options for the search language support, empty by default.
|
||||
# 'ja' uses this config value.
|
||||
# 'zh' user can custom change `jieba` dictionary path.
|
||||
#
|
||||
# html_search_options = {'type': 'default'}
|
||||
|
||||
# The name of a javascript file (relative to the configuration directory) that
|
||||
# implements a search results scorer. If empty, the default will be used.
|
||||
#
|
||||
# html_search_scorer = 'scorer.js'
|
||||
|
||||
# Output file base name for HTML help builder.
|
||||
htmlhelp_basename = 'anosqldoc'
|
||||
|
||||
# -- Options for LaTeX output ---------------------------------------------
|
||||
|
||||
latex_elements = {
|
||||
# The paper size ('letterpaper' or 'a4paper').
|
||||
#
|
||||
# 'papersize': 'letterpaper',
|
||||
|
||||
# The font size ('10pt', '11pt' or '12pt').
|
||||
#
|
||||
# 'pointsize': '10pt',
|
||||
|
||||
# Additional stuff for the LaTeX preamble.
|
||||
#
|
||||
# 'preamble': '',
|
||||
|
||||
# Latex figure (float) alignment
|
||||
#
|
||||
# 'figure_align': 'htbp',
|
||||
}
|
||||
|
||||
# Grouping the document tree into LaTeX files. List of tuples
|
||||
# (source start file, target name, title,
|
||||
# author, documentclass [howto, manual, or own class]).
|
||||
latex_documents = [
|
||||
(master_doc, 'anosql.tex', u'anosql Documentation',
|
||||
u'Honza Pokorny', 'manual'),
|
||||
]
|
||||
|
||||
# The name of an image file (relative to this directory) to place at the top of
|
||||
# the title page.
|
||||
#
|
||||
# latex_logo = None
|
||||
|
||||
# For "manual" documents, if this is true, then toplevel headings are parts,
|
||||
# not chapters.
|
||||
#
|
||||
# latex_use_parts = False
|
||||
|
||||
# If true, show page references after internal links.
|
||||
#
|
||||
# latex_show_pagerefs = False
|
||||
|
||||
# If true, show URL addresses after external links.
|
||||
#
|
||||
# latex_show_urls = False
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
#
|
||||
# latex_appendices = []
|
||||
|
||||
# If false, will not define \strong, \code, \titleref, \crossref ... but only
|
||||
# \sphinxstrong, ..., \sphinxtitleref, ... To help avoid clash with user added
|
||||
# packages.
|
||||
#
|
||||
# latex_keep_old_macro_names = True
|
||||
|
||||
# If false, no module index is generated.
|
||||
#
|
||||
# latex_domain_indices = True
|
||||
|
||||
|
||||
# -- Options for manual page output ---------------------------------------
|
||||
|
||||
# One entry per manual page. List of tuples
|
||||
# (source start file, name, description, authors, manual section).
|
||||
man_pages = [
|
||||
(master_doc, 'anosql', u'anosql Documentation',
|
||||
[author], 1)
|
||||
]
|
||||
|
||||
# If true, show URL addresses after external links.
|
||||
#
|
||||
# man_show_urls = False
|
||||
|
||||
|
||||
# -- Options for Texinfo output -------------------------------------------
|
||||
|
||||
# Grouping the document tree into Texinfo files. List of tuples
|
||||
# (source start file, target name, title, author,
|
||||
# dir menu entry, description, category)
|
||||
texinfo_documents = [
|
||||
(master_doc, 'anosql', u'anosql Documentation',
|
||||
author, 'anosql', 'One line description of project.',
|
||||
'Miscellaneous'),
|
||||
]
|
||||
|
||||
# Documents to append as an appendix to all manuals.
|
||||
#
|
||||
# texinfo_appendices = []
|
||||
|
||||
# If false, no module index is generated.
|
||||
#
|
||||
# texinfo_domain_indices = True
|
||||
|
||||
# How to display URL addresses: 'footnote', 'no', or 'inline'.
|
||||
#
|
||||
# texinfo_show_urls = 'footnote'
|
||||
|
||||
# If true, do not generate a @detailmenu in the "Top" node's menu.
|
||||
#
|
||||
# texinfo_no_detailmenu = False
|
185  projects/anosql/doc/defining_queries.rst  Normal file
@@ -0,0 +1,185 @@
####################
|
||||
Defining SQL Queries
|
||||
####################
|
||||
|
||||
Query Names & Comments
|
||||
======================
|
||||
|
||||
Name definitions are how ``anosql`` determines how to name the SQL code blocks which are loaded.
|
||||
A query name definition is a normal SQL comment starting with "\-\- name:" and is followed by the
|
||||
name of the query. You can use ``-`` or ``_`` in your query names, but the methods in Python
|
||||
will always be valid Python names using underscores.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-all-blogs
|
||||
select * from blogs;
|
||||
|
||||
The above example when loaded by ``anosql.from_path`` will return an object with a
|
||||
``.get_all_blogs(conn)`` method.
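
For instance (a minimal sketch, assuming a hypothetical ``blogs.db`` SQLite database and a
``blogs.sql`` file containing the query above)::

    import sqlite3

    import anosql

    conn = sqlite3.connect("blogs.db")
    queries = anosql.from_path("blogs.sql", "sqlite3")
    blogs = queries.get_all_blogs(conn)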
|
||||
|
||||
Your SQL comments will be added to your methods as Python docstrings, and accessible by calling
|
||||
``help()`` on them.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-all-blogs
|
||||
-- Fetch all fields for every blog in the database.
|
||||
select * from blogs;
|
||||
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
queries = anosql.from_path("blogs.sql", "sqlite3")
|
||||
help(queries.get_all_blogs)
|
||||
|
||||
returns
|
||||
|
||||
.. code-block:: text
|
||||
|
||||
Help on function get_all_blogs in module anosql.anosql:
|
||||
|
||||
get_all_blogs(conn, *args, **kwargs)
|
||||
Fetch all fields for every blog in the database.
|
||||
|
||||
.. _query-operations:
|
||||
|
||||
Query Operations
|
||||
================
|
||||
|
||||
Adding query operator symbols to the end of query names will inform ``anosql`` of how to
|
||||
execute and return results. In the above section the ``get-all-blogs`` name has no special operator
|
||||
characters trailing it. This lack of operator is actually the most basic operator which performs
|
||||
SQL ``select`` statements and returns a list of rows. When writing an application you will often
|
||||
need to perform other operations besides selects, like inserts, deletes, and bulk operations. The
|
||||
operators detailed in this section let you declare in your SQL how your code should be executed
|
||||
by the database driver.
|
||||
|
||||
Insert/Update/Delete with ``!``
|
||||
-------------------------------
|
||||
|
||||
The ``!`` operator will execute SQL without returning any results. It is meant for use with ``insert``,
|
||||
``update``, and ``delete`` statements for which returned data is not required.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: publish-blog!
|
||||
insert into blogs(userid, title, content) values (:userid, :title, :content);
|
||||
|
||||
-- name: remove-blog!
|
||||
-- Remove a blog from the database
|
||||
delete from blogs where blogid = :blogid;
|
||||
|
||||
|
||||
The methods generated are:
|
||||
|
||||
- ``publish_blog(conn, *args, **kwargs)``
|
||||
- ``remove_blog(conn, *args, **kwargs)``
|
||||
|
||||
Each of them can be run to alter the database, but both will return ``None``.
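
For example, a sketch of calling them with named parameters matching the queries above::

    queries.publish_blog(conn, userid=1, title="Hello, World", content="...")
    queries.remove_blog(conn, blogid=1)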
|
||||
|
||||
Insert Returning with ``<!``
|
||||
----------------------------
|
||||
|
||||
Sometimes when performing an insert it is necessary to receive some information back about the
|
||||
newly created database row. The ``<!`` operator tells anosql to execute the insert query, but also to expect and
|
||||
return some data.
|
||||
|
||||
In SQLite this means the ``cur.lastrowid`` will be returned.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: publish-blog<!
|
||||
insert into blogs(userid, title, content) values (:userid, :title, :content);
|
||||
|
||||
Will return the ``blogid`` of the inserted row.
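
A sketch of what this looks like with the ``sqlite3`` driver::

    queries = anosql.from_path("blogs.sql", "sqlite3")
    blogid = queries.publish_blog(conn, userid=1, title="Hi", content="word.")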
|
||||
|
||||
PostgreSQL however allows returning multiple values via the ``returning`` clause of insert
|
||||
queries.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: publish-blog<!
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content
|
||||
)
|
||||
values (
|
||||
:userid,
|
||||
:title,
|
||||
:content
|
||||
)
|
||||
returning blogid, title;
|
||||
|
||||
This will insert the new blog row and return both its ``blogid`` and ``title`` values as follows::
|
||||
|
||||
queries = anosql.from_path("blogs.sql", "psycopg2")
|
||||
blogid, title = queries.publish_blog(conn, userid=1, title="Hi", content="word.")
|
||||
|
||||
Insert/Update/Delete Many with ``*!``
|
||||
-------------------------------------
|
||||
|
||||
The DB-API 2.0 drivers like ``sqlite3`` and ``psycopg2`` have an ``executemany`` method which
|
||||
executes a SQL command against all parameter sequences or mappings found in a sequence. This
|
||||
is useful for bulk updates to the database. The below example is a PostgreSQL statement to insert
|
||||
many blog rows.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: bulk-publish*!
|
||||
-- Insert many blogs at once
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content,
|
||||
published
|
||||
)
|
||||
values (
|
||||
:userid,
|
||||
:title,
|
||||
:content,
|
||||
:published
|
||||
)
|
||||
|
||||
Applying this to a list of blogs in Python::
|
||||
|
||||
queries = anosql.from_path("blogs.sql", "psycopg2")
|
||||
blogs = [
|
||||
{"userid": 1, "title": "First Blog", "content": "...", "published": datetime(2018, 1, 1)},
|
||||
{"userid": 1, "title": "Next Blog", "content": "...", "published": datetime(2018, 1, 2)},
|
||||
{"userid": 2, "title": "Hey, Hey!", "content": "...", "published": datetime(2018, 7, 28)},
|
||||
]
|
||||
queries.bulk_publish(conn, blogs)
|
||||
|
||||
Execute SQL script statements with ``#``
|
||||
---------------------------------------------
|
||||
|
||||
Executes some SQL statements as a script. These methods don't do variable substitution, or return
|
||||
any rows. An example use case is using data definition statements like ``create table`` in order to
|
||||
set up your database.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: create-schema#
|
||||
create table users (
|
||||
userid integer not null primary key,
|
||||
username text not null,
|
||||
firstname integer not null,
|
||||
lastname text not null
|
||||
);
|
||||
|
||||
create table blogs (
|
||||
blogid integer not null primary key,
|
||||
userid integer not null,
|
||||
title text not null,
|
||||
content text not null,
|
||||
published date not null default CURRENT_DATE,
|
||||
foreign key(userid) references users(userid)
|
||||
);
|
||||
|
||||
From code::
|
||||
|
||||
queries = anosql.from_path("create_schema.sql", "sqlite3")
|
||||
queries.create_schema(conn)
|
||||
|
50  projects/anosql/doc/extending.rst  Normal file
@@ -0,0 +1,50 @@
.. _extending-anosql:
|
||||
|
||||
################
|
||||
Extending anosql
|
||||
################
|
||||
|
||||
.. _driver-adapters:
|
||||
|
||||
Driver Adapters
|
||||
---------------
|
||||
|
||||
Database driver adapters in ``anosql`` are duck-typed classes which follow the below interface.::
|
||||
|
||||
class MyDbAdapter():
|
||||
def process_sql(self, name, op_type, sql):
|
||||
pass
|
||||
|
||||
def select(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
@contextmanager
|
||||
def select_cursor(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_update_delete(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_update_delete_many(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_returning(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def execute_script(self, conn, sql):
|
||||
pass
|
||||
|
||||
|
||||
anosql.core.register_driver_adapter("mydb", MyDbAdapter)
|
||||
|
||||
If your adapter constructor takes arguments you can register a function which can build
|
||||
your adapter instance::
|
||||
|
||||
def adapter_factory():
|
||||
return MyDbAdapter("foo", 42)
|
||||
|
||||
anosql.core.register_driver_adapter("mydb", adapter_factory)
|
||||
|
||||
Looking at the source of the builtin
|
||||
`adapters/ <https://github.com/honza/anosql/tree/master/anosql/adapters>`_ is a great place
|
||||
to start seeing how you may write your own database driver adapter.
|
56  projects/anosql/doc/getting_started.rst  Normal file
@@ -0,0 +1,56 @@
###############
|
||||
Getting Started
|
||||
###############
|
||||
|
||||
Below is an example of a program which can print ``"{greeting}, {world_name}!"`` from data held in a minimal SQLite
|
||||
database containing greetings and worlds.
|
||||
|
||||
The SQL is in a ``greetings.sql`` file with ``-- name:`` definitions on each query to tell ``anosql`` under which name
|
||||
we would like to be able to execute them. For example, the query under the name ``get-all-greetings`` in the example
|
||||
below will be available to us after loading via ``anosql.from_path`` as a method ``get_all_greetings(conn)``.
|
||||
Each method on an ``anosql.Queries`` object accepts a database connection to use in communicating with the database.
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-all-greetings
|
||||
-- Get all the greetings in the database
|
||||
select greeting_id, greeting from greetings;
|
||||
|
||||
-- name: get-worlds-by-name
|
||||
-- Get world records from the database by name.
|
||||
select world_id,
|
||||
world_name,
|
||||
location
|
||||
from worlds
|
||||
where world_name = :world_name;
|
||||
|
||||
By specifying ``db_driver="sqlite3"`` we can use the Python stdlib ``sqlite3`` driver to execute these SQL queries and
|
||||
get the results. We're also using the ``sqlite3.Row`` type for our records to make it easy to access our data via
|
||||
their column names rather than as tuple indices.
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
import sqlite3
|
||||
import anosql
|
||||
|
||||
queries = anosql.from_path("greetings.sql", db_driver="sqlite3")
|
||||
conn = sqlite3.connect("greetings.db")
|
||||
conn.row_factory = sqlite3.Row
|
||||
|
||||
greetings = queries.get_all_greetings(conn)
|
||||
worlds = queries.get_worlds_by_name(conn, world_name="Earth")
|
||||
# greetings = [
|
||||
# <Row greeting_id=1, greeting="Hi">,
|
||||
# <Row greeting_id=2, greeting="Aloha">,
|
||||
# <Row greeting_id=3, greeting="Hola">
|
||||
# ]
|
||||
# worlds = [<Row world_id=1, world_name="Earth">]
|
||||
|
||||
for world_row in worlds:
|
||||
for greeting_row in greetings:
|
||||
print(f"{greeting_row['greeting']}, {world_row['world_name']}!")
|
||||
# Hi, Earth!
|
||||
# Aloha, Earth!
|
||||
# Hola, Earth!
|
||||
|
||||
conn.close()
|
142  projects/anosql/doc/index.rst  Normal file
@@ -0,0 +1,142 @@
.. anosql documentation master file, created by
|
||||
sphinx-quickstart on Mon Jul 25 09:16:20 2016.
|
||||
You can adapt this file completely to your liking, but it should at least
|
||||
contain the root `toctree` directive.
|
||||
|
||||
Welcome to anosql's documentation!
|
||||
==================================
|
||||
|
||||
.. image:: https://badge.fury.io/py/anosql.svg
|
||||
:target: https://badge.fury.io/py/anosql
|
||||
:alt: pypi package version
|
||||
|
||||
.. image:: http://readthedocs.org/projects/anosql/badge/?version=latest
|
||||
:target: http://anosql.readthedocs.io/en/latest/?badge=latest
|
||||
:alt: Documentation status
|
||||
|
||||
.. image:: https://travis-ci.org/honza/anosql.svg?branch=master
|
||||
:target: https://travis-ci.org/honza/anosql
|
||||
:alt: Travis build status
|
||||
|
||||
A Python library for using SQL
|
||||
|
||||
Inspired by the excellent `Yesql`_ library by Kris Jenkins. In my mother
|
||||
tongue, *ano* means *yes*.
|
||||
|
||||
If you are on python3.6+ or need ``anosql`` to work with ``asyncio`` based database drivers, see the related project `aiosql <https://github.com/nackjicholson/aiosql>`_.
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
::
|
||||
|
||||
$ pip install anosql
|
||||
|
||||
Usage
|
||||
-----
|
||||
|
||||
Basics
|
||||
******
|
||||
|
||||
Given a ``queries.sql`` file:
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-all-greetings
|
||||
-- Get all the greetings in the database
|
||||
SELECT * FROM greetings;
|
||||
|
||||
We can issue SQL queries, like so:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
import anosql
|
||||
import psycopg2
|
||||
import sqlite3
|
||||
|
||||
# PostgreSQL
|
||||
conn = psycopg2.connect('...')
|
||||
queries = anosql.from_path('queries.sql', 'psycopg2')
|
||||
|
||||
# Or, Sqlite3...
|
||||
conn = sqlite3.connect('cool.db')
|
||||
queries = anosql.from_path('queries.sql', 'sqlite3')
|
||||
|
||||
queries.get_all_greetings(conn)
|
||||
# => [(1, 'Hi')]
|
||||
|
||||
queries.get_all_greetings.__doc__
|
||||
# => Get all the greetings in the database
|
||||
|
||||
queries.get_all_greetings.sql
|
||||
# => SELECT * FROM greetings;
|
||||
|
||||
queries.available_queries
|
||||
# => ['get_all_greetings']
|
||||
|
||||
|
||||
Parameters
|
||||
**********
|
||||
|
||||
Often, you want to change parts of the query dynamically, particularly values in the ``WHERE`` clause.
|
||||
You can use parameters to do this:
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-greetings-for-language
|
||||
-- Get all the greetings in the database for a given language
|
||||
SELECT *
|
||||
FROM greetings
|
||||
WHERE lang = %s;
|
||||
|
||||
And they become positional parameters:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
visitor_language = "en"
|
||||
queries.get_greetings_for_language(conn, visitor_language)
|
||||
|
||||
|
||||
|
||||
Named Parameters
|
||||
****************
|
||||
|
||||
To make queries with many parameters more understandable and maintainable, you can give the parameters names:
|
||||
|
||||
.. code-block:: sql
|
||||
|
||||
-- name: get-greetings-for-language
|
||||
-- Get all the greetings in the database for a given language and length
|
||||
SELECT *
|
||||
FROM greetings
|
||||
WHERE lang = :lang
|
||||
AND len(greeting) <= :length_limit;
|
||||
|
||||
If you were writing a PostgreSQL query, you could also format the parameters as ``%(lang)s`` and ``%(length_limit)s``.
|
||||
|
||||
Then, call your queries like you would any Python function with named parameters:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
visitor_language = "en"
|
||||
|
||||
greetings_for_texting = queries.get_greetings_for_language(conn, lang=visitor_language, length_limit=140)
|
||||
|
||||
|
||||
Contents
|
||||
--------
|
||||
.. toctree::
|
||||
:maxdepth: 2
|
||||
|
||||
Getting Started <getting_started>
|
||||
Defining Queries <defining_queries>
|
||||
Extending anosql <extending>
|
||||
Upgrading <upgrading>
|
||||
API <source/modules>
|
||||
|
||||
License
|
||||
-------
|
||||
|
||||
BSD, short and sweet
|
||||
|
||||
.. _Yesql: https://github.com/krisajenkins/yesql/
|
281  projects/anosql/doc/make.bat  Normal file
@@ -0,0 +1,281 @@
@ECHO OFF
|
||||
|
||||
REM Command file for Sphinx documentation
|
||||
|
||||
if "%SPHINXBUILD%" == "" (
|
||||
set SPHINXBUILD=sphinx-build
|
||||
)
|
||||
set BUILDDIR=_build
|
||||
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% .
|
||||
set I18NSPHINXOPTS=%SPHINXOPTS% .
|
||||
if NOT "%PAPER%" == "" (
|
||||
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
|
||||
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
|
||||
)
|
||||
|
||||
if "%1" == "" goto help
|
||||
|
||||
if "%1" == "help" (
|
||||
:help
|
||||
echo.Please use `make ^<target^>` where ^<target^> is one of
|
||||
echo. html to make standalone HTML files
|
||||
echo. dirhtml to make HTML files named index.html in directories
|
||||
echo. singlehtml to make a single large HTML file
|
||||
echo. pickle to make pickle files
|
||||
echo. json to make JSON files
|
||||
echo. htmlhelp to make HTML files and a HTML help project
|
||||
echo. qthelp to make HTML files and a qthelp project
|
||||
echo. devhelp to make HTML files and a Devhelp project
|
||||
echo. epub to make an epub
|
||||
echo. epub3 to make an epub3
|
||||
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
|
||||
echo. text to make text files
|
||||
echo. man to make manual pages
|
||||
echo. texinfo to make Texinfo files
|
||||
echo. gettext to make PO message catalogs
|
||||
echo. changes to make an overview over all changed/added/deprecated items
|
||||
echo. xml to make Docutils-native XML files
|
||||
echo. pseudoxml to make pseudoxml-XML files for display purposes
|
||||
echo. linkcheck to check all external links for integrity
|
||||
echo. doctest to run all doctests embedded in the documentation if enabled
|
||||
echo. coverage to run coverage check of the documentation if enabled
|
||||
echo. dummy to check syntax errors of document sources
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "clean" (
|
||||
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
|
||||
del /q /s %BUILDDIR%\*
|
||||
goto end
|
||||
)
|
||||
|
||||
|
||||
REM Check if sphinx-build is available and fallback to Python version if any
|
||||
%SPHINXBUILD% 1>NUL 2>NUL
|
||||
if errorlevel 9009 goto sphinx_python
|
||||
goto sphinx_ok
|
||||
|
||||
:sphinx_python
|
||||
|
||||
set SPHINXBUILD=python -m sphinx.__init__
|
||||
%SPHINXBUILD% 2> nul
|
||||
if errorlevel 9009 (
|
||||
echo.
|
||||
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
|
||||
echo.installed, then set the SPHINXBUILD environment variable to point
|
||||
echo.to the full path of the 'sphinx-build' executable. Alternatively you
|
||||
echo.may add the Sphinx directory to PATH.
|
||||
echo.
|
||||
echo.If you don't have Sphinx installed, grab it from
|
||||
echo.http://sphinx-doc.org/
|
||||
exit /b 1
|
||||
)
|
||||
|
||||
:sphinx_ok
|
||||
|
||||
|
||||
if "%1" == "html" (
|
||||
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "dirhtml" (
|
||||
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "singlehtml" (
|
||||
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "pickle" (
|
||||
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished; now you can process the pickle files.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "json" (
|
||||
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished; now you can process the JSON files.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "htmlhelp" (
|
||||
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished; now you can run HTML Help Workshop with the ^
|
||||
.hhp project file in %BUILDDIR%/htmlhelp.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "qthelp" (
|
||||
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished; now you can run "qcollectiongenerator" with the ^
|
||||
.qhcp project file in %BUILDDIR%/qthelp, like this:
|
||||
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\anosql.qhcp
|
||||
echo.To view the help file:
|
||||
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\anosql.qhc
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "devhelp" (
|
||||
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "epub" (
|
||||
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The epub file is in %BUILDDIR%/epub.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "epub3" (
|
||||
%SPHINXBUILD% -b epub3 %ALLSPHINXOPTS% %BUILDDIR%/epub3
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The epub3 file is in %BUILDDIR%/epub3.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "latex" (
|
||||
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "latexpdf" (
|
||||
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
|
||||
cd %BUILDDIR%/latex
|
||||
make all-pdf
|
||||
cd %~dp0
|
||||
echo.
|
||||
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "latexpdfja" (
|
||||
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
|
||||
cd %BUILDDIR%/latex
|
||||
make all-pdf-ja
|
||||
cd %~dp0
|
||||
echo.
|
||||
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "text" (
|
||||
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The text files are in %BUILDDIR%/text.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "man" (
|
||||
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The manual pages are in %BUILDDIR%/man.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "texinfo" (
|
||||
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "gettext" (
|
||||
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "changes" (
|
||||
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.The overview file is in %BUILDDIR%/changes.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "linkcheck" (
|
||||
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Link check complete; look for any errors in the above output ^
|
||||
or in %BUILDDIR%/linkcheck/output.txt.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "doctest" (
|
||||
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Testing of doctests in the sources finished, look at the ^
|
||||
results in %BUILDDIR%/doctest/output.txt.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "coverage" (
|
||||
%SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Testing of coverage in the sources finished, look at the ^
|
||||
results in %BUILDDIR%/coverage/python.txt.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "xml" (
|
||||
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The XML files are in %BUILDDIR%/xml.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "pseudoxml" (
|
||||
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
|
||||
goto end
|
||||
)
|
||||
|
||||
if "%1" == "dummy" (
|
||||
%SPHINXBUILD% -b dummy %ALLSPHINXOPTS% %BUILDDIR%/dummy
|
||||
if errorlevel 1 exit /b 1
|
||||
echo.
|
||||
echo.Build finished. Dummy builder generates no files.
|
||||
goto end
|
||||
)
|
||||
|
||||
:end
|
7  projects/anosql/doc/source/anosql.adapters.psycopg2.rst  Normal file
@@ -0,0 +1,7 @@
anosql.adapters.psycopg2 module
|
||||
===============================
|
||||
|
||||
.. automodule:: anosql.adapters.psycopg2
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
18  projects/anosql/doc/source/anosql.adapters.rst  Normal file
@@ -0,0 +1,18 @@
anosql.adapters package
|
||||
=======================
|
||||
|
||||
Submodules
|
||||
----------
|
||||
|
||||
.. toctree::
|
||||
|
||||
anosql.adapters.psycopg2
|
||||
anosql.adapters.sqlite3
|
||||
|
||||
Module contents
|
||||
---------------
|
||||
|
||||
.. automodule:: anosql.adapters
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
7  projects/anosql/doc/source/anosql.adapters.sqlite3.rst  Normal file
@@ -0,0 +1,7 @@
anosql.adapters.sqlite3 module
|
||||
==============================
|
||||
|
||||
.. automodule:: anosql.adapters.sqlite3
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
7  projects/anosql/doc/source/anosql.core.rst  Normal file
@@ -0,0 +1,7 @@
anosql.core module
|
||||
==================
|
||||
|
||||
.. automodule:: anosql.core
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
7  projects/anosql/doc/source/anosql.exceptions.rst  Normal file
@@ -0,0 +1,7 @@
anosql.exceptions module
|
||||
========================
|
||||
|
||||
.. automodule:: anosql.exceptions
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
7
projects/anosql/doc/source/anosql.patterns.rst
Normal file
7
projects/anosql/doc/source/anosql.patterns.rst
Normal file
|
@ -0,0 +1,7 @@
|
|||
anosql.patterns module
|
||||
======================
|
||||
|
||||
.. automodule:: anosql.patterns
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
26
projects/anosql/doc/source/anosql.rst
Normal file
26
projects/anosql/doc/source/anosql.rst
Normal file
|
@ -0,0 +1,26 @@
|
|||
anosql package
|
||||
==============
|
||||
|
||||
Subpackages
|
||||
-----------
|
||||
|
||||
.. toctree::
|
||||
|
||||
anosql.adapters
|
||||
|
||||
Submodules
|
||||
----------
|
||||
|
||||
.. toctree::
|
||||
|
||||
anosql.core
|
||||
anosql.exceptions
|
||||
anosql.patterns
|
||||
|
||||
Module contents
|
||||
---------------
|
||||
|
||||
.. automodule:: anosql
|
||||
:members:
|
||||
:undoc-members:
|
||||
:show-inheritance:
|
7
projects/anosql/doc/source/modules.rst
Normal file
7
projects/anosql/doc/source/modules.rst
Normal file
|
@ -0,0 +1,7 @@
|
|||
anosql
|
||||
======
|
||||
|
||||
.. toctree::
|
||||
:maxdepth: 4
|
||||
|
||||
anosql
|
83
projects/anosql/doc/upgrading.rst
Normal file
83
projects/anosql/doc/upgrading.rst
Normal file
|
@ -0,0 +1,83 @@
|
|||
#########
|
||||
Upgrading
|
||||
#########
|
||||
|
||||
Upgrading from 0.x to 1.x
|
||||
=========================
|
||||
|
||||
Changed ``load_queries`` and ``load_queries_from_string``
|
||||
---------------------------------------------------------
|
||||
|
||||
These methods were changed, mostly for brevity. To load ``anosql`` queries, you should now use
|
||||
``anosql.from_str`` to load queries from a SQL string, and ``anosql.from_path`` to load queries
|
||||
from a SQL file, or directory of SQL files.
|
||||
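The new entry points are plain module-level functions. A minimal sketch (the file and directory
names below are only illustrative)::

    import anosql

    # From a literal SQL string:
    queries = anosql.from_str("-- name: get-all-users\nselect * from users;", "sqlite3")

    # From a single .sql file, or from a directory tree of .sql files:
    queries = anosql.from_path("queries/users.sql", "sqlite3")
    queries = anosql.from_path("queries/", "sqlite3")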
|
||||
Removed the ``$`` "record" operator
|
||||
-----------------------------------
|
||||
|
||||
Because most database drivers have more efficient, robust, and featureful ways of controlling the
|
||||
rows and records they return, this feature was removed.
|
||||
|
||||
See:
|
||||
|
||||
* `sqlite.Row <https://docs.python.org/2/library/sqlite3.html#sqlite3.Row>`_
|
||||
* `psycopg2 - Connection and Cursor subclasses <http://initd.org/psycopg/docs/extras.html#connection-and-cursor-subclasses>`_
|
||||
|
||||
|
||||
SQLite example::
|
||||
|
||||
conn = sqlite3.connect("...")
|
||||
conn.row_factory = sqlite3.Row
|
||||
actual = queries.get_all_users(conn)
|
||||
|
||||
assert actual[0]["userid"] == 1
|
||||
assert actual[0]["username"] == "bobsmith"
|
||||
assert actual[0][2] == "Bob"
|
||||
assert actual[0]["lastname" == "Smith"
|
||||
|
||||
PostgreSQL example::
|
||||
|
||||
with psycopg2.connect("...", cursor_factory=psycopg2.extras.RealDictCursor) as conn:
|
||||
actual = queries.get_all_users(conn)
|
||||
|
||||
assert actual[0] == {
|
||||
"userid": 1,
|
||||
"username": "bobsmith",
|
||||
"firstname": "Bob",
|
||||
"lastname": "Smith",
|
||||
}
|
||||
|
||||
Driver Adapter classes instead of QueryLoader
|
||||
---------------------------------------------
|
||||
|
||||
I'm not aware of anyone who actually has made or distributed an extension for ``anosql``, as it was
|
||||
only available in its current form for a few weeks. So this notice is really just for completeness.
|
||||
|
||||
For ``0.3.x`` versions of ``anosql``, in order to add a new database extension you had to build a
|
||||
subclass of ``anosql.QueryLoader``. This base class is no longer available, and driver adapters no
|
||||
longer have to extend from any class at all. They are duck-typed classes which are expected to
|
||||
adhere to a standard interface. For more information about this see :ref:`Extending anosql <extending-anosql>`.
|
||||
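As a rough sketch, a driver adapter is now just a duck-typed class registered under a driver name
(the ``MyDbAdapter`` class and the ``"mydb"`` key here are hypothetical)::

    from anosql.core import register_driver_adapter

    class MyDbAdapter:
        def process_sql(self, query_name, op_type, sql):
            return sql

        def select(self, conn, query_name, sql, parameters):
            ...

        # select_cursor, insert_update_delete, insert_update_delete_many,
        # insert_returning and execute_script complete the expected interface.

    register_driver_adapter("mydb", MyDbAdapter)

``register_driver_adapter`` accepts either the class itself or a factory function that returns a
configured instance.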
|
||||
New Things
|
||||
==========
|
||||
|
||||
Use the database driver ``cursor`` directly
|
||||
-------------------------------------------
|
||||
|
||||
All queries of the ``SELECT`` type get a companion method suffixed with ``_cursor``, which returns a context manager over the underlying database cursor. So ``get_all_blogs(conn)`` can also be used as:
|
||||
|
||||
::
|
||||
|
||||
rows = queries.get_all_blogs(conn)
|
||||
# [(1, "My Blog", "yadayada"), ...]
|
||||
|
||||
with queries.get_all_blogs_cursor(conn) as cur:
|
||||
# All the power of the underlying cursor object! Not limited to just a list of rows.
|
||||
for row in cur:
|
||||
print(row)
|
||||
|
||||
|
||||
New operator types for running scripts ``#`` and bulk-inserts ``*!``
|
||||
---------------------------------------------------------------------
|
||||
|
||||
See :ref:`Query Operations <query-operations>`
|
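As an illustrative sketch of both operators (the table and query names are made up, and ``conn``
is assumed to be an open ``sqlite3`` connection)::

    import anosql

    sql = (
        "-- name: create-table#\n"
        "create table example (a, b);\n\n"
        "-- name: bulk-insert*!\n"
        "insert into example (a, b) values (:a, :b);\n"
    )

    queries = anosql.from_str(sql, "sqlite3")
    queries.create_table(conn)  # '#' runs the statements as a script
    queries.bulk_insert(conn, [{"a": 1, "b": 2}, {"a": 3, "b": 4}])  # '*!' uses executemany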
4
projects/anosql/src/python/anosql/__init__.py
Normal file
4
projects/anosql/src/python/anosql/__init__.py
Normal file
|
@ -0,0 +1,4 @@
|
|||
from .core import from_path, from_str, SQLOperationType
|
||||
from .exceptions import SQLLoadException, SQLParseException
|
||||
|
||||
__all__ = ["from_path", "from_str", "SQLOperationType", "SQLLoadException", "SQLParseException"]
|
0
projects/anosql/src/python/anosql/adapters/__init__.py
Normal file
0
projects/anosql/src/python/anosql/adapters/__init__.py
Normal file
61
projects/anosql/src/python/anosql/adapters/psycopg2.py
Normal file
61
projects/anosql/src/python/anosql/adapters/psycopg2.py
Normal file
|
@ -0,0 +1,61 @@
|
|||
from contextlib import contextmanager
|
||||
|
||||
from ..patterns import var_pattern
|
||||
|
||||
|
||||
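# Rewrites ':var_name' placeholders into psycopg2's pyformat style ('%(var_name)s'),
# while returning double-quoted identifiers and single-quoted string literals unchanged.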
def replacer(match):
|
||||
gd = match.groupdict()
|
||||
if gd['dblquote'] is not None:
|
||||
return gd['dblquote']
|
||||
elif gd['quote'] is not None:
|
||||
return gd["quote"]
|
||||
else:
|
||||
return '{lead}%({var_name})s{trail}'.format(
|
||||
lead=gd['lead'],
|
||||
var_name=gd['var_name'],
|
||||
trail=gd['trail'],
|
||||
)
|
||||
|
||||
|
||||
class PsycoPG2Adapter(object):
|
||||
@staticmethod
|
||||
def process_sql(_query_name, _op_type, sql):
|
||||
return var_pattern.sub(replacer, sql)
|
||||
|
||||
@staticmethod
|
||||
def select(conn, _query_name, sql, parameters):
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(sql, parameters)
|
||||
return cur.fetchall()
|
||||
|
||||
@staticmethod
|
||||
@contextmanager
|
||||
def select_cursor(conn, _query_name, sql, parameters):
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(sql, parameters)
|
||||
yield cur
|
||||
|
||||
@staticmethod
|
||||
def insert_update_delete(conn, _query_name, sql, parameters):
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(sql, parameters)
|
||||
|
||||
@staticmethod
|
||||
def insert_update_delete_many(conn, _query_name, sql, parameters):
|
||||
with conn.cursor() as cur:
|
||||
cur.executemany(sql, parameters)
|
||||
|
||||
@staticmethod
|
||||
def insert_returning(conn, _query_name, sql, parameters):
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(sql, parameters)
|
||||
res = cur.fetchone()
|
||||
if res:
|
||||
return res[0] if len(res) == 1 else res
|
||||
else:
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
def execute_script(conn, sql):
|
||||
with conn.cursor() as cur:
|
||||
cur.execute(sql)
|
57
projects/anosql/src/python/anosql/adapters/sqlite3.py
Normal file
57
projects/anosql/src/python/anosql/adapters/sqlite3.py
Normal file
|
@ -0,0 +1,57 @@
|
|||
from contextlib import contextmanager
|
||||
|
||||
|
||||
class SQLite3DriverAdapter(object):
|
||||
@staticmethod
|
||||
def process_sql(_query_name, _op_type, sql):
|
||||
"""Pass through function because the ``sqlite3`` driver already handles the :var_name
|
||||
"named style" syntax used by anosql variables. Note, it will also accept "qmark style"
|
||||
variables.
|
||||
|
||||
Args:
|
||||
_query_name (str): The name of the sql query. Unused.
|
||||
_op_type (anosql.SQLOperationType): The type of SQL operation performed by the sql.
|
||||
sql (str): The sql as written before processing.
|
||||
|
||||
Returns:
|
||||
str: Original SQL text unchanged.
|
||||
"""
|
||||
return sql
|
||||
|
||||
@staticmethod
|
||||
def select(conn, _query_name, sql, parameters):
|
||||
cur = conn.cursor()
|
||||
cur.execute(sql, parameters)
|
||||
results = cur.fetchall()
|
||||
cur.close()
|
||||
return results
|
||||
|
||||
@staticmethod
|
||||
@contextmanager
|
||||
def select_cursor(conn, _query_name, sql, parameters):
|
||||
cur = conn.cursor()
|
||||
cur.execute(sql, parameters)
|
||||
try:
|
||||
yield cur
|
||||
finally:
|
||||
cur.close()
|
||||
|
||||
@staticmethod
|
||||
def insert_update_delete(conn, _query_name, sql, parameters):
|
||||
conn.execute(sql, parameters)
|
||||
|
||||
@staticmethod
|
||||
def insert_update_delete_many(conn, _query_name, sql, parameters):
|
||||
conn.executemany(sql, parameters)
|
||||
|
||||
@staticmethod
|
||||
def insert_returning(conn, _query_name, sql, parameters):
|
||||
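# No RETURNING clause handling here; the cursor's lastrowid stands in for the
# value an insert-returning ('<!') query is expected to produce.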
cur = conn.cursor()
|
||||
cur.execute(sql, parameters)
|
||||
results = cur.lastrowid
|
||||
cur.close()
|
||||
return results
|
||||
|
||||
@staticmethod
|
||||
def execute_script(conn, sql):
|
||||
conn.executescript(sql)
|
355
projects/anosql/src/python/anosql/core.py
Normal file
355
projects/anosql/src/python/anosql/core.py
Normal file
|
@ -0,0 +1,355 @@
|
|||
import os
|
||||
|
||||
from .adapters.psycopg2 import PsycoPG2Adapter
|
||||
from .adapters.sqlite3 import SQLite3DriverAdapter
|
||||
from .exceptions import SQLLoadException, SQLParseException
|
||||
from .patterns import (
|
||||
query_name_definition_pattern,
|
||||
empty_pattern,
|
||||
doc_comment_pattern,
|
||||
valid_query_name_pattern,
|
||||
)
|
||||
|
||||
|
||||
_ADAPTERS = {
|
||||
"psycopg2": PsycoPG2Adapter,
|
||||
"sqlite3": SQLite3DriverAdapter,
|
||||
}
|
||||
|
||||
|
||||
def register_driver_adapter(driver_name, driver_adapter):
|
||||
"""Registers custom driver adapter classes to extend ``anosql`` to to handle additional drivers.
|
||||
|
||||
For details on how to create a new driver adapter see :ref:`driver-adapters` documentation.
|
||||
|
||||
Args:
|
||||
driver_name (str): The driver type name.
|
||||
driver_adapter (callable): Either a class or a function which creates an instance of a
|
||||
driver adapter.
|
||||
|
||||
Returns:
|
||||
None
|
||||
|
||||
Examples:
|
||||
To register a new loader::
|
||||
|
||||
class MyDbAdapter():
|
||||
def process_sql(self, name, op_type, sql):
|
||||
pass
|
||||
|
||||
def select(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
@contextmanager
|
||||
def select_cursor(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_update_delete(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_update_delete_many(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def insert_returning(self, conn, sql, parameters):
|
||||
pass
|
||||
|
||||
def execute_script(self, conn, sql):
|
||||
pass
|
||||
|
||||
|
||||
anosql.register_driver_adapter("mydb", MyDbAdapter)
|
||||
|
||||
If your adapter constructor takes arguments you can register a function which can build
|
||||
your adapter instance::
|
||||
|
||||
def adapter_factory():
|
||||
return MyDbAdapter("foo", 42)
|
||||
|
||||
anosql.register_driver_adapter("mydb", adapter_factory)
|
||||
|
||||
"""
|
||||
_ADAPTERS[driver_name] = driver_adapter
|
||||
|
||||
|
||||
def get_driver_adapter(driver_name):
|
||||
"""Get the driver adapter instance registered by the ``driver_name``.
|
||||
|
||||
Args:
|
||||
driver_name (str): The database driver name.
|
||||
|
||||
Returns:
|
||||
object: A driver adapter class.
|
||||
"""
|
||||
try:
|
||||
driver_adapter = _ADAPTERS[driver_name]
|
||||
except KeyError:
|
||||
raise ValueError("Encountered unregistered driver_name: {}".format(driver_name))
|
||||
|
||||
return driver_adapter()
|
||||
|
||||
|
||||
class SQLOperationType(object):
|
||||
"""Enumeration (kind of) of anosql operation types
|
||||
"""
|
||||
INSERT_RETURNING = 0
|
||||
INSERT_UPDATE_DELETE = 1
|
||||
INSERT_UPDATE_DELETE_MANY = 2
|
||||
SCRIPT = 3
|
||||
SELECT = 4
|
||||
SELECT_ONE_ROW = 5
|
||||
|
||||
|
||||
class Queries:
|
||||
"""Container object with dynamic methods built from SQL queries.
|
||||
|
||||
The ``-- name`` definition comments in the SQL content determine what the dynamic
|
||||
methods of this class will be named.
|
||||
|
||||
@DynamicAttrs
|
||||
"""
|
||||
|
||||
def __init__(self, queries=None):
|
||||
"""Queries constructor.
|
||||
|
||||
Args:
|
||||
queries (list(tuple)):
|
||||
"""
|
||||
if queries is None:
|
||||
queries = []
|
||||
self._available_queries = set()
|
||||
|
||||
for query_name, fn in queries:
|
||||
self.add_query(query_name, fn)
|
||||
|
||||
@property
|
||||
def available_queries(self):
|
||||
"""Returns listing of all the available query methods loaded in this class.
|
||||
|
||||
Returns:
|
||||
list(str): List of dot-separated method accessor names.
|
||||
"""
|
||||
return sorted(self._available_queries)
|
||||
|
||||
def __repr__(self):
|
||||
return "Queries(" + self.available_queries.__repr__() + ")"
|
||||
|
||||
def add_query(self, query_name, fn):
|
||||
"""Adds a new dynamic method to this class.
|
||||
|
||||
Args:
|
||||
query_name (str): The method name as found in the SQL content.
|
||||
fn (function): The loaded query function.
|
||||
|
||||
Returns:
None
|
||||
|
||||
"""
|
||||
setattr(self, query_name, fn)
|
||||
self._available_queries.add(query_name)
|
||||
|
||||
def add_child_queries(self, child_name, child_queries):
|
||||
"""Adds a Queries object as a property.
|
||||
|
||||
Args:
|
||||
child_name (str): The property name to group the child queries under.
|
||||
child_queries (Queries): Queries instance to add as sub-queries.
|
||||
|
||||
Returns:
|
||||
None
|
||||
|
||||
"""
|
||||
setattr(self, child_name, child_queries)
|
||||
for child_query_name in child_queries.available_queries:
|
||||
self._available_queries.add("{}.{}".format(child_name, child_query_name))
|
||||
|
||||
|
||||
def _create_fns(query_name, docs, op_type, sql, driver_adapter):
|
||||
def fn(conn, *args, **kwargs):
|
||||
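# Prefer keyword arguments when any were given; otherwise pass the positional
# argument tuple straight through to the driver adapter.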
parameters = kwargs if len(kwargs) > 0 else args
|
||||
if op_type == SQLOperationType.INSERT_RETURNING:
|
||||
return driver_adapter.insert_returning(conn, query_name, sql, parameters)
|
||||
elif op_type == SQLOperationType.INSERT_UPDATE_DELETE:
|
||||
return driver_adapter.insert_update_delete(conn, query_name, sql, parameters)
|
||||
elif op_type == SQLOperationType.INSERT_UPDATE_DELETE_MANY:
|
||||
return driver_adapter.insert_update_delete_many(conn, query_name, sql, *parameters)
|
||||
elif op_type == SQLOperationType.SCRIPT:
|
||||
return driver_adapter.execute_script(conn, sql)
|
||||
elif op_type == SQLOperationType.SELECT_ONE_ROW:
|
||||
res = driver_adapter.select(conn, query_name, sql, parameters)
|
||||
return res[0] if len(res) == 1 else None
|
||||
elif op_type == SQLOperationType.SELECT:
|
||||
return driver_adapter.select(conn, query_name, sql, parameters)
|
||||
else:
|
||||
raise ValueError("Unknown op_type: {}".format(op_type))
|
||||
|
||||
fn.__name__ = query_name
|
||||
fn.__doc__ = docs
|
||||
fn.sql = sql
|
||||
|
||||
ctx_mgr_method_name = "{}_cursor".format(query_name)
|
||||
|
||||
def ctx_mgr(conn, *args, **kwargs):
|
||||
parameters = kwargs if len(kwargs) > 0 else args
|
||||
return driver_adapter.select_cursor(conn, query_name, sql, parameters)
|
||||
|
||||
ctx_mgr.__name__ = ctx_mgr_method_name
|
||||
ctx_mgr.__doc__ = docs
|
||||
ctx_mgr.sql = sql
|
||||
|
||||
if op_type == SQLOperationType.SELECT:
|
||||
return [(query_name, fn), (ctx_mgr_method_name, ctx_mgr)]
|
||||
|
||||
return [(query_name, fn)]
|
||||
|
||||
|
||||
def load_methods(sql_text, driver_adapter):
|
||||
lines = sql_text.strip().splitlines()
|
||||
query_name = lines[0].replace("-", "_")
|
||||
|
||||
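# The trailing characters of the name comment select the operation type:
# '<!' insert-returning, '*!' bulk insert/update/delete, '!' insert/update/delete,
# '#' script, '?' select one row; a bare name is a plain SELECT.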
if query_name.endswith("<!"):
|
||||
op_type = SQLOperationType.INSERT_RETURNING
|
||||
query_name = query_name[:-2]
|
||||
elif query_name.endswith("*!"):
|
||||
op_type = SQLOperationType.INSERT_UPDATE_DELETE_MANY
|
||||
query_name = query_name[:-2]
|
||||
elif query_name.endswith("!"):
|
||||
op_type = SQLOperationType.INSERT_UPDATE_DELETE
|
||||
query_name = query_name[:-1]
|
||||
elif query_name.endswith("#"):
|
||||
op_type = SQLOperationType.SCRIPT
|
||||
query_name = query_name[:-1]
|
||||
elif query_name.endswith("?"):
|
||||
op_type = SQLOperationType.SELECT_ONE_ROW
|
||||
query_name = query_name[:-1]
|
||||
else:
|
||||
op_type = SQLOperationType.SELECT
|
||||
|
||||
if not valid_query_name_pattern.match(query_name):
|
||||
raise SQLParseException(
|
||||
'name must convert to valid python variable, got "{}".'.format(query_name)
|
||||
)
|
||||
|
||||
docs = ""
|
||||
sql = ""
|
||||
for line in lines[1:]:
|
||||
match = doc_comment_pattern.match(line)
|
||||
if match:
|
||||
docs += match.group(1) + "\n"
|
||||
else:
|
||||
sql += line + "\n"
|
||||
|
||||
docs = docs.strip()
|
||||
sql = driver_adapter.process_sql(query_name, op_type, sql.strip())
|
||||
|
||||
return _create_fns(query_name, docs, op_type, sql, driver_adapter)
|
||||
|
||||
|
||||
def load_queries_from_sql(sql, driver_adapter):
|
||||
queries = []
|
||||
for query_text in query_name_definition_pattern.split(sql):
|
||||
if not empty_pattern.match(query_text):
|
||||
for method_pair in load_methods(query_text, driver_adapter):
|
||||
queries.append(method_pair)
|
||||
return queries
|
||||
|
||||
|
||||
def load_queries_from_file(file_path, driver_adapter):
|
||||
with open(file_path) as fp:
|
||||
return load_queries_from_sql(fp.read(), driver_adapter)
|
||||
|
||||
|
||||
def load_queries_from_dir_path(dir_path, query_loader):
|
||||
if not os.path.isdir(dir_path):
|
||||
raise ValueError("The path {} must be a directory".format(dir_path))
|
||||
|
||||
def _recurse_load_queries(path):
|
||||
queries = Queries()
|
||||
for item in os.listdir(path):
|
||||
item_path = os.path.join(path, item)
|
||||
if os.path.isfile(item_path) and not item.endswith(".sql"):
|
||||
continue
|
||||
elif os.path.isfile(item_path) and item.endswith(".sql"):
|
||||
for name, fn in load_queries_from_file(item_path, query_loader):
|
||||
queries.add_query(name, fn)
|
||||
elif os.path.isdir(item_path):
|
||||
child_queries = _recurse_load_queries(item_path)
|
||||
queries.add_child_queries(item, child_queries)
|
||||
else:
|
||||
# This should be practically unreachable.
|
||||
raise SQLLoadException(
|
||||
"The path must be a directory or file, got {}".format(item_path)
|
||||
)
|
||||
return queries
|
||||
|
||||
return _recurse_load_queries(dir_path)
|
||||
|
||||
|
||||
def from_str(sql, driver_name):
|
||||
"""Load queries from a SQL string.
|
||||
|
||||
Args:
|
||||
sql (str): A string containing SQL statements and anosql name definitions.
|
||||
driver_name (str): The database driver to use to load and execute queries.
|
||||
|
||||
Returns:
|
||||
Queries
|
||||
|
||||
Example:
|
||||
Loading queries from a SQL string::
|
||||
|
||||
import sqlite3
|
||||
import anosql
|
||||
|
||||
sql_text = \"""
|
||||
-- name: get-all-greetings
|
||||
-- Get all the greetings in the database
|
||||
select * from greetings;
|
||||
|
||||
-- name: get-users-by-username
|
||||
-- Get all the users from the database,
|
||||
-- and return it as a dict
|
||||
select * from users where username = :username;
|
||||
\"""
|
||||
|
||||
queries = anosql.from_str(sql_text, driver_name="sqlite3")
|
||||
queries.get_all_greetings(conn)
|
||||
queries.get_users_by_username(conn, username="willvaughn")
|
||||
|
||||
"""
|
||||
driver_adapter = get_driver_adapter(driver_name)
|
||||
return Queries(load_queries_from_sql(sql, driver_adapter))
|
||||
|
||||
|
||||
def from_path(sql_path, driver_name):
|
||||
"""Load queries from a sql file, or a directory of sql files.
|
||||
|
||||
Args:
|
||||
sql_path (str): Path to a ``.sql`` file or directory containing ``.sql`` files.
|
||||
driver_name (str): The database driver to use to load and execute queries.
|
||||
|
||||
Returns:
|
||||
Queries
|
||||
|
||||
Example:
|
||||
Loading queries paths::
|
||||
|
||||
import sqlite3
|
||||
import anosql
|
||||
|
||||
queries = anosql.from_path("./greetings.sql", driver_name="sqlite3")
|
||||
queries2 = anosql.from_path("./sql_dir", driver_name="sqlite3")
|
||||
|
||||
"""
|
||||
if not os.path.exists(sql_path):
|
||||
raise SQLLoadException('File does not exist: {}.'.format(sql_path), sql_path)
|
||||
|
||||
driver_adapter = get_driver_adapter(driver_name)
|
||||
|
||||
if os.path.isdir(sql_path):
|
||||
return load_queries_from_dir_path(sql_path, driver_adapter)
|
||||
elif os.path.isfile(sql_path):
|
||||
return Queries(load_queries_from_file(sql_path, driver_adapter))
|
||||
else:
|
||||
raise SQLLoadException(
|
||||
'The sql_path must be a directory or file, got {}'.format(sql_path),
|
||||
sql_path
|
||||
)
|
6
projects/anosql/src/python/anosql/exceptions.py
Normal file
6
projects/anosql/src/python/anosql/exceptions.py
Normal file
|
@ -0,0 +1,6 @@
|
|||
class SQLLoadException(Exception):
|
||||
pass
|
||||
|
||||
|
||||
class SQLParseException(Exception):
|
||||
pass
|
30
projects/anosql/src/python/anosql/patterns.py
Normal file
30
projects/anosql/src/python/anosql/patterns.py
Normal file
|
@ -0,0 +1,30 @@
|
|||
import re
|
||||
|
||||
query_name_definition_pattern = re.compile(r"--\s*name\s*:\s*")
|
||||
"""
|
||||
Pattern: Identifies name definition comments.
|
||||
"""
|
||||
|
||||
empty_pattern = re.compile(r"^\s*$")
|
||||
"""
|
||||
Pattern: Identifies empty lines.
|
||||
"""
|
||||
|
||||
valid_query_name_pattern = re.compile(r"\w+")
|
||||
"""
|
||||
Pattern: Enforces names are valid python variable names.
|
||||
"""
|
||||
|
||||
doc_comment_pattern = re.compile(r"\s*--\s*(.*)$")
|
||||
"""
|
||||
Pattern: Identifies SQL comments.
|
||||
"""
|
||||
|
||||
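# Matches :var_name placeholders, and also captures double-quoted identifiers and
# single-quoted strings so substitution can leave them untouched; e.g. the psycopg2
# adapter uses this to rewrite "where userid = :userid" as "where userid = %(userid)s".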
var_pattern = re.compile(
|
||||
r'(?P<dblquote>"[^"]+")|'
|
||||
r"(?P<quote>\'[^\']+\')|"
|
||||
r"(?P<lead>[^:]):(?P<var_name>[\w-]+)(?P<trail>[^:]?)"
|
||||
)
|
||||
"""
|
||||
Pattern: Identifies variable definitions in SQL code.
|
||||
"""
|
0
projects/anosql/test/python/__init__.py
Normal file
0
projects/anosql/test/python/__init__.py
Normal file
3
projects/anosql/test/python/blogdb/data/blogs_data.csv
Normal file
3
projects/anosql/test/python/blogdb/data/blogs_data.csv
Normal file
|
@ -0,0 +1,3 @@
|
|||
1,What I did Today,"I mowed the lawn - washed some clothes - ate a burger.",2017-07-28
|
||||
3,Testing,Is this thing on?,2018-01-01
|
||||
1,How to make a pie.,"1. Make crust\n2. Fill\n3. Bake\n4.Eat",2018-11-23
|
|
3
projects/anosql/test/python/blogdb/data/users_data.csv
Normal file
3
projects/anosql/test/python/blogdb/data/users_data.csv
Normal file
|
@ -0,0 +1,3 @@
|
|||
bobsmith,Bob,Smith
|
||||
johndoe,John,Doe
|
||||
janedoe,Jane,Doe
|
|
26
projects/anosql/test/python/blogdb/sql/blogs/blogs.sql
Normal file
26
projects/anosql/test/python/blogdb/sql/blogs/blogs.sql
Normal file
|
@ -0,0 +1,26 @@
|
|||
-- name: publish-blog<!
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content,
|
||||
published
|
||||
)
|
||||
values (
|
||||
:userid,
|
||||
:title,
|
||||
:content,
|
||||
:published
|
||||
)
|
||||
|
||||
-- name: remove-blog!
|
||||
-- Remove a blog from the database
|
||||
delete from blogs where blogid = :blogid;
|
||||
|
||||
|
||||
-- name: get-user-blogs
|
||||
-- Get blogs authored by a user.
|
||||
select title,
|
||||
published
|
||||
from blogs
|
||||
where userid = :userid
|
||||
order by published desc;
|
41
projects/anosql/test/python/blogdb/sql/blogs/blogs_pg.sql
Normal file
41
projects/anosql/test/python/blogdb/sql/blogs/blogs_pg.sql
Normal file
|
@ -0,0 +1,41 @@
|
|||
-- name: pg-get-blogs-published-after
|
||||
-- Get all blogs by all authors published after the given date.
|
||||
select title,
|
||||
username,
|
||||
to_char(published, 'YYYY-MM-DD HH24:MI') as published
|
||||
from blogs
|
||||
join users using(userid)
|
||||
where published >= :published
|
||||
order by published desc;
|
||||
|
||||
|
||||
-- name: pg-publish-blog<!
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content,
|
||||
published
|
||||
)
|
||||
values (
|
||||
:userid,
|
||||
:title,
|
||||
:content,
|
||||
:published
|
||||
)
|
||||
returning blogid, title;
|
||||
|
||||
|
||||
-- name: pg-bulk-publish*!
|
||||
-- Insert many blogs at once
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content,
|
||||
published
|
||||
)
|
||||
values (
|
||||
:userid,
|
||||
:title,
|
||||
:content,
|
||||
:published
|
||||
)
|
|
@ -0,0 +1,20 @@
|
|||
-- name: sqlite-get-blogs-published-after
|
||||
-- Get all blogs by all authors published after the given date.
|
||||
select b.title,
|
||||
u.username,
|
||||
strftime('%Y-%m-%d %H:%M', b.published) as published
|
||||
from blogs b
|
||||
inner join users u on b.userid = u.userid
|
||||
where b.published >= :published
|
||||
order by b.published desc;
|
||||
|
||||
|
||||
-- name: sqlite-bulk-publish*!
|
||||
-- Insert many blogs at once
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content,
|
||||
published
|
||||
)
|
||||
values (?, ?, ? , ?);
|
11
projects/anosql/test/python/blogdb/sql/users/users.sql
Normal file
11
projects/anosql/test/python/blogdb/sql/users/users.sql
Normal file
|
@ -0,0 +1,11 @@
|
|||
-- name: get-all
|
||||
-- Get all user records
|
||||
select * from users;
|
||||
|
||||
-- name: get-all-sorted
|
||||
-- Get all user records sorted by username
|
||||
select * from users order by username asc;
|
||||
|
||||
-- name: get-one?
|
||||
-- Get one user based on its id
|
||||
select username, firstname, lastname from users where userid = %s;
|
116
projects/anosql/test/python/conftest.py
Normal file
116
projects/anosql/test/python/conftest.py
Normal file
|
@ -0,0 +1,116 @@
|
|||
import csv
|
||||
import os
|
||||
import sqlite3
|
||||
|
||||
import pytest
|
||||
|
||||
BLOGDB_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), "blogdb")
|
||||
USERS_DATA_PATH = os.path.join(BLOGDB_PATH, "data", "users_data.csv")
|
||||
BLOGS_DATA_PATH = os.path.join(BLOGDB_PATH, "data", "blogs_data.csv")
|
||||
|
||||
|
||||
def populate_sqlite3_db(db_path):
|
||||
conn = sqlite3.connect(db_path)
|
||||
cur = conn.cursor()
|
||||
cur.executescript(
|
||||
"""
|
||||
create table users (
|
||||
userid integer not null primary key,
|
||||
username text not null,
|
||||
firstname text not null,
|
||||
lastname text not null
|
||||
);
|
||||
|
||||
create table blogs (
|
||||
blogid integer not null primary key,
|
||||
userid integer not null,
|
||||
title text not null,
|
||||
content text not null,
|
||||
published date not null default CURRENT_DATE,
|
||||
foreign key(userid) references users(userid)
|
||||
);
|
||||
"""
|
||||
)
|
||||
|
||||
with open(USERS_DATA_PATH) as fp:
|
||||
users = list(csv.reader(fp))
|
||||
cur.executemany(
|
||||
"""
|
||||
insert into users (
|
||||
username,
|
||||
firstname,
|
||||
lastname
|
||||
) values (?, ?, ?);""",
|
||||
users,
|
||||
)
|
||||
with open(BLOGS_DATA_PATH) as fp:
|
||||
blogs = list(csv.reader(fp))
|
||||
cur.executemany(
|
||||
"""
|
||||
insert into blogs (
|
||||
userid,
|
||||
title,
|
||||
content,
|
||||
published
|
||||
) values (?, ?, ?, ?);""",
|
||||
blogs,
|
||||
)
|
||||
|
||||
conn.commit()
|
||||
conn.close()
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def sqlite3_db_path(tmpdir):
|
||||
db_path = os.path.join(tmpdir.strpath, "blogdb.db")
|
||||
populate_sqlite3_db(db_path)
|
||||
return db_path
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def sqlite3_conn(sqlite3_db_path):
|
||||
conn = sqlite3.connect(sqlite3_db_path)
|
||||
yield conn
|
||||
conn.close()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def pg_conn(postgresql):
|
||||
with postgresql:
|
||||
# Loads data from blogdb fixture data
|
||||
with postgresql.cursor() as cur:
|
||||
cur.execute(
|
||||
"""
|
||||
create table users (
|
||||
userid serial not null primary key,
|
||||
username varchar(32) not null,
|
||||
firstname varchar(255) not null,
|
||||
lastname varchar(255) not null
|
||||
);"""
|
||||
)
|
||||
cur.execute(
|
||||
"""
|
||||
create table blogs (
|
||||
blogid serial not null primary key,
|
||||
userid integer not null references users(userid),
|
||||
title varchar(255) not null,
|
||||
content text not null,
|
||||
published date not null default CURRENT_DATE
|
||||
);"""
|
||||
)
|
||||
|
||||
with postgresql.cursor() as cur:
|
||||
with open(USERS_DATA_PATH) as fp:
|
||||
cur.copy_from(fp, "users", sep=",", columns=["username", "firstname", "lastname"])
|
||||
with open(BLOGS_DATA_PATH) as fp:
|
||||
cur.copy_from(
|
||||
fp, "blogs", sep=",", columns=["userid", "title", "content", "published"]
|
||||
)
|
||||
|
||||
return postgresql
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def pg_dsn(pg_conn):
|
||||
p = pg_conn.get_dsn_parameters()
|
||||
return "postgres://{user}@{host}:{port}/{dbname}".format(**p)
|
123
projects/anosql/test/python/test_psycopg2.py
Normal file
123
projects/anosql/test/python/test_psycopg2.py
Normal file
|
@ -0,0 +1,123 @@
|
|||
import os
|
||||
from datetime import date
|
||||
|
||||
import anosql
|
||||
import psycopg2
|
||||
import psycopg2.extras
|
||||
import pytest
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def queries():
|
||||
dir_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "blogdb", "sql")
|
||||
return anosql.from_path(dir_path, "psycopg2")
|
||||
|
||||
|
||||
def test_record_query(pg_conn, queries):
|
||||
dsn = pg_conn.get_dsn_parameters()
|
||||
with psycopg2.connect(cursor_factory=psycopg2.extras.RealDictCursor, **dsn) as conn:
|
||||
actual = queries.users.get_all(conn)
|
||||
|
||||
assert len(actual) == 3
|
||||
assert actual[0] == {
|
||||
"userid": 1,
|
||||
"username": "bobsmith",
|
||||
"firstname": "Bob",
|
||||
"lastname": "Smith",
|
||||
}
|
||||
|
||||
|
||||
def test_parameterized_query(pg_conn, queries):
|
||||
actual = queries.blogs.get_user_blogs(pg_conn, userid=1)
|
||||
expected = [("How to make a pie.", date(2018, 11, 23)), ("What I did Today", date(2017, 7, 28))]
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_parameterized_record_query(pg_conn, queries):
|
||||
dsn = pg_conn.get_dsn_parameters()
|
||||
with psycopg2.connect(cursor_factory=psycopg2.extras.RealDictCursor, **dsn) as conn:
|
||||
actual = queries.blogs.pg_get_blogs_published_after(conn, published=date(2018, 1, 1))
|
||||
|
||||
expected = [
|
||||
{"title": "How to make a pie.", "username": "bobsmith", "published": "2018-11-23 00:00"},
|
||||
{"title": "Testing", "username": "janedoe", "published": "2018-01-01 00:00"},
|
||||
]
|
||||
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_select_cursor_context_manager(pg_conn, queries):
|
||||
with queries.blogs.get_user_blogs_cursor(pg_conn, userid=1) as cursor:
|
||||
actual = cursor.fetchall()
|
||||
expected = [
|
||||
("How to make a pie.", date(2018, 11, 23)),
|
||||
("What I did Today", date(2017, 7, 28)),
|
||||
]
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_insert_returning(pg_conn, queries):
|
||||
with pg_conn:
|
||||
blogid, title = queries.blogs.pg_publish_blog(
|
||||
pg_conn,
|
||||
userid=2,
|
||||
title="My first blog",
|
||||
content="Hello, World!",
|
||||
published=date(2018, 12, 4),
|
||||
)
|
||||
with pg_conn.cursor() as cur:
|
||||
cur.execute(
|
||||
"""\
|
||||
select blogid,
|
||||
title
|
||||
from blogs
|
||||
where blogid = %s;
|
||||
""",
|
||||
(blogid,),
|
||||
)
|
||||
expected = cur.fetchone()
|
||||
|
||||
assert (blogid, title) == expected
|
||||
|
||||
|
||||
def test_delete(pg_conn, queries):
|
||||
# Removing the "janedoe" blog titled "Testing"
|
||||
actual = queries.blogs.remove_blog(pg_conn, blogid=2)
|
||||
assert actual is None
|
||||
|
||||
janes_blogs = queries.blogs.get_user_blogs(pg_conn, userid=3)
|
||||
assert len(janes_blogs) == 0
|
||||
|
||||
|
||||
def test_insert_many(pg_conn, queries):
|
||||
blogs = [
|
||||
{
|
||||
"userid": 2,
|
||||
"title": "Blog Part 1",
|
||||
"content": "content - 1",
|
||||
"published": date(2018, 12, 4),
|
||||
},
|
||||
{
|
||||
"userid": 2,
|
||||
"title": "Blog Part 2",
|
||||
"content": "content - 2",
|
||||
"published": date(2018, 12, 5),
|
||||
},
|
||||
{
|
||||
"userid": 2,
|
||||
"title": "Blog Part 3",
|
||||
"content": "content - 3",
|
||||
"published": date(2018, 12, 6),
|
||||
},
|
||||
]
|
||||
|
||||
with pg_conn:
|
||||
actual = queries.blogs.pg_bulk_publish(pg_conn, blogs)
|
||||
assert actual is None
|
||||
|
||||
johns_blogs = queries.blogs.get_user_blogs(pg_conn, userid=2)
|
||||
assert johns_blogs == [
|
||||
("Blog Part 3", date(2018, 12, 6)),
|
||||
("Blog Part 2", date(2018, 12, 5)),
|
||||
("Blog Part 1", date(2018, 12, 4)),
|
||||
]
|
215
projects/anosql/test/python/test_simple.py
Normal file
215
projects/anosql/test/python/test_simple.py
Normal file
|
@ -0,0 +1,215 @@
|
|||
import pytest
|
||||
|
||||
import anosql
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def sqlite(request):
|
||||
import sqlite3
|
||||
sqlconnection = sqlite3.connect(':memory:')
|
||||
|
||||
def fin():
|
||||
"teardown"
|
||||
print("teardown")
|
||||
sqlconnection.close()
|
||||
|
||||
request.addfinalizer(fin)
|
||||
|
||||
return sqlconnection
|
||||
|
||||
|
||||
def test_simple_query(sqlite):
|
||||
_test_create_insert = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (a, b, c);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (1, 2, 3);\n")
|
||||
|
||||
q = anosql.from_str(_test_create_insert, "sqlite3")
|
||||
q.create_some_table(sqlite)
|
||||
q.insert_some_value(sqlite)
|
||||
|
||||
|
||||
def test_auto_insert_query(sqlite):
|
||||
_test_create_insert = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (a, b, c);\n\n"
|
||||
"-- name: insert-some-value<!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (1, 2, 3);\n")
|
||||
|
||||
q = anosql.from_str(_test_create_insert, "sqlite3")
|
||||
q.create_some_table(sqlite)
|
||||
assert q.insert_some_value(sqlite) == 1
|
||||
assert q.insert_some_value(sqlite) == 2
|
||||
assert q.insert_some_value(sqlite) == 3
|
||||
|
||||
|
||||
def test_parametrized_insert(sqlite):
|
||||
_test_create_insert = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (a, b, c);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (?, ?, ?);\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT * FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_test_create_insert, "sqlite3")
|
||||
q.create_some_table(sqlite)
|
||||
q.insert_some_value(sqlite, 10, 11, 12)
|
||||
assert q.get_all_values(sqlite) == [(10, 11, 12)]
|
||||
|
||||
|
||||
def test_parametrized_insert_named(sqlite):
|
||||
_test_create_insert = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (a, b, c);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (:a, :b, :c);\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT * FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_test_create_insert, "sqlite3")
|
||||
q.create_some_table(sqlite)
|
||||
q.insert_some_value(sqlite, c=12, b=11, a=10)
|
||||
assert q.get_all_values(sqlite) == [(10, 11, 12)]
|
||||
|
||||
|
||||
def test_one_row(sqlite):
|
||||
_test_one_row = ("-- name: one-row?\n"
|
||||
"SELECT 1, 'hello';\n\n"
|
||||
"-- name: two-rows?\n"
|
||||
"SELECT 1 UNION SELECT 2;\n")
|
||||
q = anosql.from_str(_test_one_row, "sqlite3")
|
||||
assert q.one_row(sqlite) == (1, 'hello')
|
||||
assert q.two_rows(sqlite) is None
|
||||
|
||||
|
||||
def test_simple_query_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (1, 2, 3);\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
q.insert_some_value(postgresql)
|
||||
|
||||
assert q.get_all_values(postgresql) == [(1, 2, 3)]
|
||||
|
||||
|
||||
def test_auto_insert_query_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value<!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (1, 2, 3) returning id;\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
|
||||
assert q.insert_some_value(postgresql) == 1
|
||||
assert q.insert_some_value(postgresql) == 2
|
||||
|
||||
|
||||
def test_parameterized_insert_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (%s, %s, %s);\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
q.insert_some_value(postgresql, 1, 2, 3)
|
||||
|
||||
assert q.get_all_values(postgresql) == [(1, 2, 3)]
|
||||
|
||||
|
||||
def test_auto_parameterized_insert_query_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value<!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (%s, %s, %s) returning id;\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
|
||||
assert q.insert_some_value(postgresql, 1, 2, 3) == 1
|
||||
assert q.get_all_values(postgresql) == [(1, 2, 3)]
|
||||
|
||||
assert q.insert_some_value(postgresql, 1, 2, 3) == 2
|
||||
|
||||
|
||||
def test_parameterized_select_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (1, 2, 3)\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo WHERE a = %s;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
q.insert_some_value(postgresql)
|
||||
|
||||
assert q.get_all_values(postgresql, 1) == [(1, 2, 3)]
|
||||
|
||||
|
||||
def test_parameterized_insert_named_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (%(a)s, %(b)s, %(c)s)\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
q.insert_some_value(postgresql, a=1, b=2, c=3)
|
||||
|
||||
assert q.get_all_values(postgresql) == [(1, 2, 3)]
|
||||
|
||||
|
||||
def test_parameterized_select_named_pg(postgresql):
|
||||
_queries = ("-- name: create-some-table#\n"
|
||||
"-- testing insertion\n"
|
||||
"CREATE TABLE foo (id serial primary key, a int, b int, c int);\n\n"
|
||||
"-- name: insert-some-value!\n"
|
||||
"INSERT INTO foo (a, b, c) VALUES (1, 2, 3)\n\n"
|
||||
"-- name: get-all-values\n"
|
||||
"SELECT a, b, c FROM foo WHERE a = %(a)s;\n")
|
||||
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
|
||||
q.create_some_table(postgresql)
|
||||
q.insert_some_value(postgresql)
|
||||
|
||||
assert q.get_all_values(postgresql, a=1) == [(1, 2, 3)]
|
||||
|
||||
|
||||
def test_without_trailing_semi_colon_pg():
|
||||
"""Make sure keywords ending queries are recognized even without
|
||||
semi-colons.
|
||||
"""
|
||||
_queries = ("-- name: get-by-a\n"
|
||||
"SELECT a, b, c FROM foo WHERE a = :a\n")
|
||||
q = anosql.from_str(_queries, "psycopg2")
|
||||
assert q.get_by_a.sql == "SELECT a, b, c FROM foo WHERE a = %(a)s"
|
108
projects/anosql/test/python/test_sqlite3.py
Normal file
108
projects/anosql/test/python/test_sqlite3.py
Normal file
|
@ -0,0 +1,108 @@
|
|||
import os
|
||||
|
||||
import anosql
|
||||
import pytest
|
||||
|
||||
|
||||
def dict_factory(cursor, row):
|
||||
d = {}
|
||||
for idx, col in enumerate(cursor.description):
|
||||
d[col[0]] = row[idx]
|
||||
return d
|
||||
|
||||
|
||||
@pytest.fixture()
|
||||
def queries():
|
||||
dir_path = path("blogdb", "sql")
|
||||
return anosql.from_path(dir_path, "sqlite3")
|
||||
|
||||
|
||||
def test_record_query(sqlite3_conn, queries):
|
||||
sqlite3_conn.row_factory = dict_factory
|
||||
actual = queries.users.get_all(sqlite3_conn)
|
||||
|
||||
assert len(actual) == 3
|
||||
assert actual[0] == {
|
||||
"userid": 1,
|
||||
"username": "bobsmith",
|
||||
"firstname": "Bob",
|
||||
"lastname": "Smith",
|
||||
}
|
||||
|
||||
|
||||
def test_parameterized_query(sqlite3_conn, queries):
|
||||
actual = queries.blogs.get_user_blogs(sqlite3_conn, userid=1)
|
||||
expected = [("How to make a pie.", "2018-11-23"), ("What I did Today", "2017-07-28")]
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_parameterized_record_query(sqlite3_conn, queries):
|
||||
sqlite3_conn.row_factory = dict_factory
|
||||
actual = queries.blogs.sqlite_get_blogs_published_after(sqlite3_conn, published="2018-01-01")
|
||||
|
||||
expected = [
|
||||
{"title": "How to make a pie.", "username": "bobsmith", "published": "2018-11-23 00:00"},
|
||||
{"title": "Testing", "username": "janedoe", "published": "2018-01-01 00:00"},
|
||||
]
|
||||
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_select_cursor_context_manager(sqlite3_conn, queries):
|
||||
with queries.blogs.get_user_blogs_cursor(sqlite3_conn, userid=1) as cursor:
|
||||
actual = cursor.fetchall()
|
||||
expected = [("How to make a pie.", "2018-11-23"), ("What I did Today", "2017-07-28")]
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_insert_returning(sqlite3_conn, queries):
|
||||
with sqlite3_conn:
|
||||
blogid = queries.blogs.publish_blog(
|
||||
sqlite3_conn,
|
||||
userid=2,
|
||||
title="My first blog",
|
||||
content="Hello, World!",
|
||||
published="2018-12-04",
|
||||
)
|
||||
cur = sqlite3_conn.cursor()
|
||||
cur.execute(
|
||||
"""\
|
||||
select title
|
||||
from blogs
|
||||
where blogid = ?;
|
||||
""",
|
||||
(blogid,),
|
||||
)
|
||||
actual = cur.fetchone()
|
||||
cur.close()
|
||||
expected = ("My first blog",)
|
||||
|
||||
assert actual == expected
|
||||
|
||||
|
||||
def test_delete(sqlite3_conn, queries):
|
||||
# Removing the "janedoe" blog titled "Testing"
|
||||
actual = queries.blogs.remove_blog(sqlite3_conn, blogid=2)
|
||||
assert actual is None
|
||||
|
||||
janes_blogs = queries.blogs.get_user_blogs(sqlite3_conn, userid=3)
|
||||
assert len(janes_blogs) == 0
|
||||
|
||||
|
||||
def test_insert_many(sqlite3_conn, queries):
|
||||
blogs = [
|
||||
(2, "Blog Part 1", "content - 1", "2018-12-04"),
|
||||
(2, "Blog Part 2", "content - 2", "2018-12-05"),
|
||||
(2, "Blog Part 3", "content - 3", "2018-12-06"),
|
||||
]
|
||||
|
||||
with sqlite3_conn:
|
||||
actual = queries.blogs.sqlite_bulk_publish(sqlite3_conn, blogs)
|
||||
assert actual is None
|
||||
|
||||
johns_blogs = queries.blogs.get_user_blogs(sqlite3_conn, userid=2)
|
||||
assert johns_blogs == [
|
||||
("Blog Part 3", "2018-12-06"),
|
||||
("Blog Part 2", "2018-12-05"),
|
||||
("Blog Part 1", "2018-12-04"),
|
||||
]
|
|
@ -1,16 +1,19 @@
|
|||
aiohttp==3.7.4.post0
|
||||
alabaster==0.7.12
|
||||
anosql==1.0.2
|
||||
async-timeout==3.0.1
|
||||
attrs==20.3.0
|
||||
autoflake==1.4
|
||||
Babel==2.9.0
|
||||
beautifulsoup4==4.9.3
|
||||
bleach==4.0.0
|
||||
certifi==2020.12.5
|
||||
chardet==4.0.0
|
||||
click==7.1.2
|
||||
commonmark==0.9.1
|
||||
coverage==5.5
|
||||
docutils==0.17
|
||||
Flask==2.0.1
|
||||
hypothesis==6.14.5
|
||||
idna==2.10
|
||||
imagesize==1.2.0
|
||||
|
@ -18,14 +21,16 @@ importlib-metadata==4.0.1
|
|||
iniconfig==1.1.1
|
||||
isodate==0.6.0
|
||||
isort==5.8.0
|
||||
itsdangerous==2.0.1
|
||||
jedi==0.18.0
|
||||
Jinja2==2.11.3
|
||||
Jinja2==3.0.1
|
||||
jsonschema==3.2.0
|
||||
livereload==2.6.3
|
||||
lxml==4.6.3
|
||||
m2r==0.2.1
|
||||
MarkupSafe==1.1.1
|
||||
MarkupSafe==2.0.1
|
||||
meraki==1.7.2
|
||||
mirakuru==2.4.1
|
||||
mistune==0.8.4
|
||||
multidict==5.1.0
|
||||
mypy-extensions==0.4.3
|
||||
|
@ -34,8 +39,13 @@ openapi-spec-validator==0.3.0
|
|||
packaging==20.9
|
||||
parso==0.8.2
|
||||
pathspec==0.8.1
|
||||
pep517==0.11.0
|
||||
pip-tools==6.2.0
|
||||
pluggy==0.13.1
|
||||
port-for==0.6.1
|
||||
prompt-toolkit==3.0.18
|
||||
psutil==5.8.0
|
||||
psycopg2==2.9.1
|
||||
pudb==2020.1
|
||||
py==1.10.0
|
||||
pyflakes==2.3.1
|
||||
|
@ -44,6 +54,7 @@ pyparsing==2.4.7
|
|||
pyrsistent==0.17.3
|
||||
pytest==6.2.3
|
||||
pytest-cov==2.11.1
|
||||
pytest-postgresql==3.1.1
|
||||
pytest-pudb==0.7.0
|
||||
pytz==2021.1
|
||||
PyYAML==5.4.1
|
||||
|
@ -68,6 +79,7 @@ sphinxcontrib-programoutput==0.17
|
|||
sphinxcontrib-qthelp==1.0.3
|
||||
sphinxcontrib-serializinghtml==1.1.4
|
||||
toml==0.10.2
|
||||
tomli==1.2.1
|
||||
tornado==6.1
|
||||
typed-ast==1.4.2
|
||||
typing-extensions==3.7.4.3
|
||||
|
@ -76,6 +88,9 @@ untokenize==0.1.1
|
|||
urllib3==1.26.4
|
||||
urwid==2.1.2
|
||||
wcwidth==0.2.5
|
||||
webencodings==0.5.1
|
||||
Werkzeug==2.0.1
|
||||
yamllint==1.26.1
|
||||
yarl==1.6.3
|
||||
yaspin==1.5.0
|
||||
zipp==3.5.0
|
||||
|
|