Merge pull request #300 from rackerlabs/master

Promote Master to Stable
This commit is contained in:
Andrew Melton 2014-03-04 13:38:20 -05:00
commit 9468220493
44 changed files with 4547 additions and 464 deletions

1
.gitignore vendored

@@ -11,3 +11,4 @@ etc/stacktach_worker_config.json
etc/stacktach_verifier_config.json
verifier.log
verifier.log.*
.gitattributes

@@ -11,89 +11,5 @@ OpenStack has the ability to publish notifications to a RabbitMQ exchange as the
A detailed description of the notifications published by OpenStack [is available here](http://wiki.openstack.org/SystemUsageData)
StackTach has three primary components:
1. The Worker daemon. Consumes the notifications from the Rabbit queue and writes them to a SQL database.
1. The Web UI, which is a Django application. Provides a real-time display of notifications as they are consumed by the worker. Also provides for point-and-click analysis of the events for following related events.
1. Stacky, the command line tool. Operators and admins aren't big fans of web interfaces. StackTach also exposes a REST interface which Stacky can use to provide output suitable for tail/grep post-processing.
## Installing StackTach
### The "Hurry Up" Install Guide
1. Create a database for StackTach to use. By default, StackTach assumes MySQL, but you can modify the settings.py file to use another database.
1. Install django and the other required libraries listed in `./etc/pip-requires.txt` (I hope I got 'em all)
1. Clone this repo
1. Copy and configure the config files in `./etc` (see below for details)
1. Create the necessary database tables (`python manage.py syncdb`). You don't need an administrator account since there are no user profiles used.
1. Configure OpenStack to publish Notifications back into RabbitMQ (see below)
1. Restart the OpenStack services.
1. Run the Worker to start consuming messages. (see below)
1. Run the web server (`python manage.py runserver`)
1. Point your browser to `http://127.0.0.1:8000` (the default server location)
1. Click on stuff, see what happens. You can't hurt anything, it's all read-only.
Of course, this is only suitable for playing around. If you want to get serious about deploying StackTach you should set up a proper webserver and database on standalone servers. There is a lot of data that gets collected by StackTach (depending on your deployment size) ... be warned. Keep an eye on DB size.
#### The Config Files
There are two config files for StackTach. The first one tells us where the second one is. A sample of these two files is in `./etc/sample_*`. Create a local copy of these files and populate them with the appropriate config values as described below.
The `sample_stacktach_config.sh` shell script defines the necessary environment variables StackTach needs. Most of these are just information about the database (assuming MySql) but some are a little different. **Remember to source the local copy of the `sample_stacktach_config.sh` shell script to set up the necessary environment variables.**
If your db host is not on the same machine, you'll need to set this flag. Otherwise the empty string is fine.
`STACKTACH_INSTALL_DIR` should point to the directory StackTach runs from. In most cases this will be your repo directory, but it could be elsewhere if you're going for a proper deployment.
The StackTach worker needs to know which RabbitMQ servers to listen to. This information is stored in the deployment file. `STACKTACH_DEPLOYMENTS_FILE` should point to this json file. To learn more about the deployments file, see further down.
Finally, `DJANGO_SETTINGS_MODULE` tells Django where to get its configuration from. This should point to the `settings.py` file. You shouldn't have to do much with the `settings.py` file, since most of what it needs comes from these environment variables.
The `sample_stacktach_worker_config.json` file tells StackTach where each of the RabbitMQ servers are that it needs to get events from. In most cases you'll only have one entry in this file, but for large multi-cell deployments, this file can get pretty large. It's also handy for setting up one StackTach for each developer environment.
The file is in json format and the main configuration is under the `"deployments"` key, which should contain a list of deployment dictionaries.
A blank worker config file would look like this:
```
{"deployments": [] }
```
But that's not much fun. A deployment entry would look like this:
```
{"deployments": [
{
"name": "east_coast.prod.cell1",
"durable_queue": false,
"rabbit_host": "10.0.1.1",
"rabbit_port": 5672,
"rabbit_userid": "rabbit",
"rabbit_password": "rabbit",
"rabbit_virtual_host": "/"
}
]}
```
where *name* is whatever you want to call your deployment and *rabbit_<>* are the connectivity details for your Rabbit server. These should match the settings OpenStack is using in your `nova.conf` file. Note that JSON has no concept of comments, so using `#`, `//` or `/* */` as a comment won't work.
By default, Nova uses ephemeral queues. If you are using durable queues, be sure to change the necessary flag here.
You can add as many deployments as you like.
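Before starting the worker, the deployments file can be sanity-checked for the fields shown above. A minimal sketch (the `load_deployments` helper and its required-key list are illustrative, not part of StackTach itself; note `durable_queue` is treated as optional since Nova defaults to ephemeral queues):

```python
import json

# Keys every deployment entry in the sample config carries.
REQUIRED_KEYS = {"name", "rabbit_host", "rabbit_port",
                 "rabbit_userid", "rabbit_password", "rabbit_virtual_host"}

def load_deployments(text):
    """Parse a worker config string and return the list of deployments.

    Raises ValueError if a deployment is missing an expected key.
    """
    config = json.loads(text)
    deployments = config.get("deployments", [])
    for dep in deployments:
        missing = REQUIRED_KEYS - set(dep)
        if missing:
            raise ValueError("deployment %r missing keys: %s"
                             % (dep.get("name"), sorted(missing)))
    return deployments
```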
#### Starting the Worker
Note: the worker now uses librabbitmq; be sure to install it first.
`./worker/start_workers.py` will spawn a worker.py process for each deployment defined. Each worker will consume from a single Rabbit queue.
#### Configuring Nova to generate Notifications
`--notification_driver=nova.openstack.common.notifier.rabbit_notifier`
`--notification_topics=monitor`
This will tell OpenStack to publish notifications to a Rabbit exchange starting with `monitor.*` ... this may result in `monitor.info`, `monitor.error`, etc.
You'll need to restart Nova once these changes are made.
### Next Steps
Once you have this working well, you should download and install Stacky and play with the command line tool.
## Documentation
http://stacktach.readthedocs.org/

177
docs/Makefile Normal file

@@ -0,0 +1,177 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
rm -rf $(BUILDDIR)/*
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/StackTach.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/StackTach.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/StackTach"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/StackTach"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

808
docs/api.rst Normal file

@@ -0,0 +1,808 @@
The StackTach REST Interface
############################
JSON Response Format
********************
StackTach uses a tabular JSON response format to make it easier for
Stacky to display generic results.
The JSON response format is as follows: ::
[
['column header', 'column header', 'column header', ...],
['row 1, col 1', 'row 1, col 2', 'row 1, col 3', ...],
['row 2, col 1', 'row 2, col 2', 'row 2, col 3', ...],
['row 3, col 1', 'row 3, col 2', 'row 3, col 3', ...],
...
]
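Since every standard response follows this header-row-plus-data-rows shape, a small client-side helper can turn it into dicts keyed by column header. A hypothetical sketch (not part of StackTach or Stacky):

```python
def rows_to_dicts(result):
    """Convert StackTach's tabular JSON (header row + data rows) to dicts."""
    headers = result[0]
    return [dict(zip(headers, row)) for row in result[1:]]
```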
stacky/deployments
==================
.. http:get:: http://example.com/stacky/deployments/
The list of all available deployments
**Example request**:
.. sourcecode:: http
GET /stacky/deployments/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
['#', 'Name'],
[1, 'deployment name'],
[2, 'deployment name'],
...
]
stacky/events
=============
.. http:get:: http://example.com/stacky/events/
The distinct list of all event names
**Example request**:
.. sourcecode:: http
GET /stacky/events/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
['Event Name'],
["add_fixed_ip_to_instance"],
["attach_volume"],
["change_instance_metadata"],
["compute.instance.create.end"],
["compute.instance.create.error"],
["compute.instance.create.start"],
["compute.instance.create_ip.end"],
...
]
:query service: ``nova`` or ``glance``. default="nova"
stacky/hosts
============
.. http:get:: http://example.com/stacky/hosts/
The distinct list of all hosts sending notifications.
**Example request**:
.. sourcecode:: http
GET /stacky/hosts/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
['Host Name'],
["compute-1"],
["compute-2"],
["scheduler-x"],
["api-88"],
...
]
:query service: ``nova`` or ``glance``. default="nova"
stacky/uuid
===========
.. http:get:: http://example.com/stacky/uuid/
Retrieve all notifications for instances with a given UUID.
**Example request**:
.. sourcecode:: http
GET /stacky/uuid/?uuid=77e0f192-00a2-4f14-ad56-7467897828ea HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["#", "?", "When", "Deployment", "Event", "Host", "State",
"State'", "Task"],
[
40065869,
" ",
"2014-01-14 15:39:22.574829",
"region-1",
"compute.instance.snapshot.start",
"compute-99",
"active",
"",
""
],
[
40065879,
" ",
"2014-01-14 15:39:23.599298",
"region-1",
"compute.instance.update",
"compute-99",
"active",
"active",
"image_snapshot"
],
...
]
:query uuid: UUID of desired instance.
:query service: ``nova`` or ``glance``. default="nova"
stacky/timings/uuid/
====================
.. http:get:: http://example.com/stacky/timings/uuid/
Retrieve all timings for a given instance. Timings are the time
deltas between related .start and .end notifications. For example,
the time difference between ``compute.instance.run_instance.start``
and ``compute.instance.run_instance.end``.
The first column of the response will be
* ``S`` if there is a ``.start`` event and no ``.end``
* ``E`` if there is a ``.end`` event and no ``.start``
* ``.`` if there was a ``.start`` and ``.end`` event
No time difference will be returned in the ``S`` or ``E`` cases.
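The pairing logic described above can be sketched as follows. This is an illustrative simplification that pairs at most one `.start`/`.end` per base event name; StackTach's actual matching is more involved:

```python
def pair_events(events):
    """Pair .start/.end notifications into (marker, base_name, delta) rows.

    `events` is a list of (event_name, unixtime) tuples. The marker is '.'
    when both ends are present, 'S' or 'E' when one side is missing,
    matching the first response column described above.
    """
    starts, ends = {}, {}
    for name, when in events:
        if name.endswith(".start"):
            starts[name[:-len(".start")]] = when
        elif name.endswith(".end"):
            ends[name[:-len(".end")]] = when
    rows = []
    for base in sorted(set(starts) | set(ends)):
        if base in starts and base in ends:
            rows.append((".", base, ends[base] - starts[base]))
        elif base in starts:
            rows.append(("S", base, None))  # no .end seen
        else:
            rows.append(("E", base, None))  # no .start seen
    return rows
```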
**Example request**:
.. sourcecode:: http
GET /stacky/timings/uuid/?uuid=77e0f192-00a2-4f14-ad56-7467897828ea HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["?", "Event", "Time (secs)"],
[".", "compute.instance.create", "0d 00:00:55.50"],
[".", "compute.instance.snapshot", "0d 00:14:11.71"],
[".", "compute.instance.snapshot", "0d 00:17:31.33"],
[".", "compute.instance.snapshot", "0d 00:16:48.88"]
...
]
:query uuid: UUID of desired instance.
:query service: ``nova`` or ``glance``. default="nova"
stacky/summary
==============
.. http:get:: http://example.com/stacky/summary/
Returns timing summary information for each event type
collected. Only notifications with ``.start``/``.end`` pairs
are considered.
This includes: ::
* the number of events seen of each type (N)
* the minimum time seen
* the maximum time seen
* the average time seen
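Given a set of timing deltas, the summary columns can be reproduced with a simple aggregation. A hypothetical sketch (real responses format the times as `0d HH:MM:SS` strings; plain seconds are used here):

```python
def summarize(timings):
    """Aggregate (event, delta_seconds) pairs into N/Min/Max/Avg per event."""
    buckets = {}
    for event, delta in timings:
        buckets.setdefault(event, []).append(delta)
    return {
        event: {"N": len(ds), "Min": min(ds), "Max": max(ds),
                "Avg": sum(ds) / len(ds)}
        for event, ds in buckets.items()
    }
```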
**Example request**:
.. sourcecode:: http
GET /stacky/summary/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["Event", "N", "Min", "Max", "Avg"],
["compute.instance.create", 50,
"0d 00:00:52.88", "0d 01:41:14.27", "0d 00:08:26"],
["compute.instance.create_ip", 50,
"0d 00:00:06.80", "5d 20:16:47.08", "0d 03:47:17"],
...
]
:query uuid: UUID of desired instance.
:query service: ``nova`` or ``glance``. default="nova"
:query limit: the number of timings to return.
:query offset: offset into query result set to start from.
stacky/request
==============
.. http:get:: http://example.com/stacky/request/
Returns all notifications related to a particular Request ID.
The ``?`` column will be ``E`` if the event came from the ``.error``
queue. ``State`` and ``State'`` are the current state and the previous
state, respectively.
**Example request**:
.. sourcecode:: http
GET /stacky/request/?request_id=req-a7517402-6192-4d0a-85a1-e14051790d5a HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["#", "?", "When", "Deployment", "Event", "Host", "State",
"State'", "Task'"
],
[
40368306,
" ",
"2014-01-15 15:39:34.130286",
"region-1",
"compute.instance.update",
"api-1",
"active",
"active",
null
],
[
40368308,
" ",
"2014-01-15 15:39:34.552434",
"region-1",
"compute.instance.update",
"api-1",
"active",
null,
null
],
...
]
:query request_id: desired request ID
:query when_min: unixtime to start search
:query when_max: unixtime to end search
:query limit: the number of timings to return.
:query offset: offset into query result set to start from.
stacky/reports
==============
.. http:get:: http://example.com/stacky/reports/
Returns a list of all available reports.
The ``Start`` and ``End`` columns refer to the time span
the report covers (in unixtime).
**Example request**:
.. sourcecode:: http
GET /stacky/reports/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["Id", "Start", "End", "Created", "Name", "Version"],
[
5971,
1389726000.0,
1389729599.0,
1389730212.9474499,
"summary for region: all",
4
],
[
5972,
1389729600.0,
1389733199.0,
1389733809.979934,
"summary for region: all",
4
],
...
]
:query created_from: unixtime to start search
:query created_to: unixtime to end search
:query limit: the number of timings to return.
:query offset: offset into query result set to start from.
stacky/report/<report_id>
=========================
.. http:get:: http://example.com/stacky/report/<report_id>
Returns a specific report.
The contents of the report vary by report, but all are in row/column
format, with Row 0 being a special *metadata* row.
Row 0 of each report is a dictionary of metadata about the report. The
actual rows/columns of the report start at Row 1 (where Row 1
is the column headers and Rows 2+ are the details, as with other result
sets).
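A client can peel the metadata row off before treating the rest as a normal result set. A minimal sketch (the `split_report` helper is illustrative):

```python
def split_report(report):
    """Split a report response into (metadata, headers, detail_rows).

    Row 0 is the metadata dict, Row 1 the column headers, Rows 2+ details.
    """
    metadata, headers, rows = report[0], report[1], report[2:]
    return metadata, headers, rows
```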
**Example request**:
.. sourcecode:: http
GET /stacky/report/1/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
{
"4xx failure count": 0,
"4xx failure percentage": 0.0,
"5xx failure count": 1,
"5xx failure percentage": 0.018284904,
"> 30 failure count": 13,
"> 30 failure percentage": 1.13479794,
"cells": [
"c0001",
"global",
"c0003",
"c0004",
"c0011",
"c0010",
"a0001",
"c0012",
"b0002",
"a0002"
],
"end": 1389729599.0,
"failure_grand_rate": 0.2445074415308293,
"failure_grand_total": 14,
"hours": 1,
"pct": 0.014999999999999999,
"percentile": 97,
"region": null,
"start": 1389726000.0,
"state failure count": 0,
"state failure percentage": 0.0,
"total": 411
},
["Operation", "Image", "OS Type", "Min", "Max", "Med", "97%", "Requests",
"4xx", "% 4xx", "5xx", "% 5xx", "> 30", "% > 30", "state", "% state"],
[
"aux",
"snap",
"windows",
"0s",
"5s",
"0s",
"5s",
6,
0,
0.0,
0,
0.0,
0,
0.0,
0,
0.0
],
[
"resize",
"base",
"linux",
"1s",
"5:44s",
"1:05s",
"3:44s",
9,
0,
0.0,
0,
0.0,
0,
0.0,
0,
0.0
],
...
]
stacky/reports/search/
=========================
.. http:get:: http://example.com/stacky/reports/search
Returns reports that match the search criteria in descending order of id.
The contents of the report vary by report, but all are in row/column
format, with Row 0 being a special *metadata* row.
The actual rows/columns of the report start at Row 1.
**Example request**:
.. sourcecode:: http
GET /stacky/reports/search/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
[
"Id",
"Start",
"End",
"Created",
"Name",
"Version"
],
[
4253,
"2013-11-21 00:00:00",
"2013-11-22 00:00:00",
"2013-11-22 01:44:55",
"public outbound bandwidth",
1
],
[
4252,
"2014-01-18 00:00:00",
"2013-11-22 00:00:00",
"2013-11-22 01:44:55",
"image events audit",
1
],
[
4248,
"2013-11-21 00:00:00",
"2013-11-22 00:00:00",
"2013-11-22 01:44:55",
"Error detail report",
1
],
...
]
:query id: integer report id
:query name: string report name (can include spaces)
:query period_start: start of the period the report covers, in the format YYYY-MM-DD HH:MM[:ss[.uuuuuu]][TZ]
:query period_end: end of the period the report covers, in the format YYYY-MM-DD HH:MM[:ss[.uuuuuu]][TZ]
:query created: the day the report was created, in the format YYYY-MM-DD
stacky/show/<event_id>
======================
.. http:get:: http://example.com/stacky/show/<event_id>/
Show the details on a specific notification.
The response of this operation is non-standard. It returns 3 rows:
* The first row is the traditional row-column result set used by most
commands.
* The second row is a prettified, stringified version of the full JSON payload
of the raw notification.
* The third row is the UUID of the related instance, if any.
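The three-part response can be unpacked like so. This is a hypothetical helper; the Key/Value table is folded into a dict for convenience, dropping its header row:

```python
def parse_show(response):
    """Unpack the three-row /stacky/show/ response.

    Returns (details_dict, raw_json_string, instance_uuid).
    """
    kv_rows, raw_payload, uuid = response
    details = {key: value for key, value in kv_rows[1:]}  # skip header row
    return details, raw_payload, uuid
```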
**Example request**:
.. sourcecode:: http
GET /stacky/show/1234/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
[
["Key", "Value"],
["#", 1234 ],
["When", "2014-01-15 20:39:44.277745"],
["Deployment", "region-1"],
["Category", "monitor.info"],
["Publisher", "compute-1"],
["State", "active"],
["Event", "compute.instance.update"],
["Service", "compute"],
["Host", "compute-1"],
["UUID", "8eba1a6d-43eb-1343-8d1a-5e596f5233b5"],
["Req ID", "req-1368539d-f645-4d96-842e-03b5c5c9dc8c"],
...
],
"[\n \"monitor.info\", \n {\n \"_context_request_id\": \"req-13685e9d-f645-4d96-842e-03b5c5c9dc8c\", \n \"_context_quota_class\": null, \n \"event_type\": \"compute.instance.update\", \n \"_context_service_catalog\": [], \n \"_context_auth_token\": \"d81a25d03bb340bb82b4b67d105cc42d\", \n \"_context_user_id\": \"b83e2fac644c4215bc449fb4b5c9bbfa\", \n \"payload\": {\n \"state_description\": \"\", \n \"availability_zone\": null, \n \"terminated_at\": \"\", \n \"ephemeral_gb\": 300, \n ...",
"8eba1a6d-43eb-1343-8d1a-5e596f5233b5"
]
:query service: ``nova`` or ``glance``. default="nova"
:query event_id: desired Event ID
stacky/watch/<deployment_id>
============================
.. http:get:: http://example.com/stacky/watch/<deployment_id>/
Get a real-time feed of notifications.
Once again, this is a non-standard response (not the typical row-column format).
This call returns a tuple of information:
* A list of column widths, to be used as a hint for formatting.
* A list of events that meet the query criteria.
* the db id of the event
* the type of event (``E`` for errors, ``.`` otherwise)
* stringified date of the event
* stringified time of the event
* deployment name
* the event name
* the instance UUID, if available
* The ending unixtime timestamp. The last time covered by this query
(utcnow, essentially)
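The column-width hints are meant for lining up the event rows when rendering the feed. A minimal sketch of using them (the `format_watch_rows` helper is hypothetical):

```python
def format_watch_rows(widths, rows):
    """Pad each cell to the hinted column width and join into display lines."""
    lines = []
    for row in rows:
        cells = [str(cell).ljust(width) for cell, width in zip(row, widths)]
        lines.append(" ".join(cells).rstrip())
    return lines
```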
**Example request**:
.. sourcecode:: http
GET /stacky/watch/14/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
[10, 1, 15, 20, 50, 36],
[
... events ...
],
"1389892207"
]
:query service: ``nova`` or ``glance``. default="nova"
:query since: get all events since ``unixtime``. Defaults to 2 seconds ago.
:query event_name: only watch for ``event_name`` notifications. Defaults to all events.
stacky/search
=============
.. http:get:: http://example.com/stacky/search/
Search for notifications.
Returns:
* Event ID
* ``E`` for errors, ``.`` otherwise
* unixtime for when the event was generated
* the deployment name
* the event name
* the host name
* the instance UUID
* the request ID
**Example request**:
.. sourcecode:: http
GET /stacky/search/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
[...event info as listed above...]
]
:query service: ``nova`` or ``glance``. default="nova"
:query field: notification field to search on.
:query value: notification values to find.
:query when_min: unixtime to start search
:query when_max: unixtime to end search
stacky/usage/launches
=====================
.. http:get:: http://example.com/stacky/usage/launches/
Return a list of all instance launches.
**Example request**:
.. sourcecode:: http
GET /stacky/usage/launches/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["UUID", "Launched At", "Instance Type Id", "Instance Flavor Id"],
[
... usage launch records ...
]
]
:query instance: desired instance UUID (optional)
stacky/usage/deletes
====================
.. http:get:: http://example.com/stacky/usage/deletes/
Return a list of all instance deletes.
**Example request**:
.. sourcecode:: http
GET /stacky/usage/deletes/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["UUID", "Launched At", "Deleted At"]
[
... usage deleted records ...
]
]
:query instance: desired instance UUID (optional)
stacky/usage/exists
===================
.. http:get:: http://example.com/stacky/usage/exists/
Return a list of all instance exists notifications.
**Example request**:
.. sourcecode:: http
GET /stacky/usage/exists/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
[
["UUID", "Launched At", "Deleted At", "Instance Type Id",
"Instance Flavor Id", "Message ID", "Status"]
[
... usage exists records ...
]
]
:query instance: desired instance UUID (optional)

258
docs/conf.py Normal file

@@ -0,0 +1,258 @@
# -*- coding: utf-8 -*-
#
# StackTach documentation build configuration file, created by
# sphinx-quickstart on Tue Jan 14 14:34:29 2014.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ['sphinxcontrib.httpdomain']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'StackTach'
copyright = u'2014, Sandy Walsh'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = '1.0'
# The full version, including alpha/beta/rc tags.
release = '1.0'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build']
# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
# If true, keep warnings as "system message" paragraphs in the built documents.
#keep_warnings = False
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# "<project> v<release> documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
# directly to the root of the documentation.
#html_extra_path = []
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
#html_use_smartypants = True
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a <link> tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'StackTachdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#'preamble': '',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
('index', 'StackTach.tex', u'StackTach Documentation',
u'Sandy Walsh', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'stacktach', u'StackTach Documentation',
[u'Sandy Walsh'], 1)
]
# If true, show URL addresses after external links.
#man_show_urls = False
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
('index', 'StackTach', u'StackTach Documentation',
  u'Sandy Walsh', 'StackTach', 'Debugging and usage monitoring for OpenStack.',
'Miscellaneous'),
]
# Documents to append as an appendix to all manuals.
#texinfo_appendices = []
# If false, no module index is generated.
#texinfo_domain_indices = True
# How to display URL addresses: 'footnote', 'no', or 'inline'.
#texinfo_show_urls = 'footnote'
# If true, do not generate a @detailmenu in the "Top" node's menu.
#texinfo_no_detailmenu = False

951
docs/dbapi.rst Normal file

@ -0,0 +1,951 @@
The StackTach Database REST Interface
#####################################
JSON Response Format
********************
The StackTach Database API uses a more standard data model for access to database objects. The Database API is read-only, with the exception of usage confirmation, which is used to indicate that usage has been sent downstream.
The JSON response format uses an envelope with a single key to indicate the type of object returned. This object can be either a dictionary, in the case of queries that return single objects, or a list when multiple objects are returned.
Sample JSON response, single object: ::
{
"enitity":
{
"id": 1
"key1": "value1",
"key2": "value2"
}
}
Sample JSON response, multiple objects: ::
{
"enitities":
[
{
"id": 1,
"key1": "value1",
"key2": "value2"
},
{
"id": 2,
"key1": "value1",
"key2": "value2"
}
]
}
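The single-key envelope convention above can be handled generically on the client side. A minimal sketch in plain Python (the ``unwrap`` helper is hypothetical, not part of StackTach):

```python
def unwrap(payload):
    """Return (kind, data) from a StackTach DB API envelope.

    The envelope is a dict with exactly one key: a singular name
    wrapping a dict for single-object queries, or a plural name
    wrapping a list for multi-object queries.
    """
    if len(payload) != 1:
        raise ValueError("expected a single-key envelope, got %r" % list(payload))
    key, data = next(iter(payload.items()))
    return key, data


single = {"entity": {"id": 1, "key1": "value1", "key2": "value2"}}
kind, obj = unwrap(single)
print(kind, obj["id"])  # entity 1
```

The same helper works for list responses, since the caller can branch on whether the returned data is a dict or a list.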
Write APIs
**********
db/confirm/usage/exists/batch/
==============================
.. http:put:: http://example.com/db/confirm/usage/exists/batch/
Uses the provided message_ids and HTTP status codes to update image and instance exists send_status values.
**Example V0 request**:
.. sourcecode:: http
PUT db/confirm/usage/exists/batch/ HTTP/1.1
Host: example.com
Accept: application/json
{
"messages":
[
{"nova_message_id": 200},
{"nova_message_id": 400}
]
}
**Example V1 request**:
.. sourcecode:: http
PUT db/confirm/usage/exists/batch/ HTTP/1.1
Host: example.com
Accept: application/json
{
"messages":
[
{
"nova":
[
{"nova_message_id1": 200},
{"nova_message_id2": 400}
],
"glance":
[
{"glance_message_id1": 200},
{"glance_message_id2": 400}
]
}
],
"version": 1
}
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
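A V1 request body like the one above can be assembled from the per-service results before the PUT is issued. A sketch using only the standard library; the ``build_batch_confirm`` helper and the message ids are illustrative, not StackTach code:

```python
import json


def build_batch_confirm(nova_results, glance_results):
    """Build a V1 body for PUT /db/confirm/usage/exists/batch/.

    nova_results and glance_results map a message_id to the HTTP
    status code returned by the downstream system for that usage.
    """
    return {
        "messages": [
            {
                "nova": [{mid: status} for mid, status in nova_results.items()],
                "glance": [{mid: status} for mid, status in glance_results.items()],
            }
        ],
        "version": 1,
    }


body = build_batch_confirm({"nova_message_id1": 200, "nova_message_id2": 400},
                           {"glance_message_id1": 200})
print(json.dumps(body, indent=2))
```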
Read APIs
*********
db/stats/events
===============
.. http:get:: http://example.com/db/stats/events/
Returns a count of events stored in Stacktach's Rawdata tables from
``when_min`` to ``when_max``
**Query Parameters**
* ``event``: event type to filter by
* ``when_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``when_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``service``: ``nova`` or ``glance``. default="nova"
**Example request**:
.. sourcecode:: http
GET db/stats/events/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"count": 10
}
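The optional filters above go on the query string. A small sketch that composes the request URL with the standard library (the ``stats_events_url`` helper and the event name are illustrative):

```python
from urllib.parse import urlencode


def stats_events_url(base, event=None, when_min=None, when_max=None,
                     service="nova"):
    """Compose a /db/stats/events/ URL with its optional filters."""
    params = {"service": service}
    if event:
        params["event"] = event
    if when_min:
        params["when_min"] = when_min
    if when_max:
        params["when_max"] = when_max
    # urlencode handles the spaces and colons in the datetime values
    return "%s/db/stats/events/?%s" % (base.rstrip("/"), urlencode(params))


url = stats_events_url("http://example.com",
                       event="compute.instance.create.end",
                       when_min="2014-01-17 00:00:00",
                       when_max="2014-01-18 00:00:00")
```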
db/stats/nova/exists/
=====================
.. http:get:: http://example.com/db/stats/nova/exists
Returns a list of status combinations and count of events with those status combinations.
Note: Only status combinations with >0 count will show up.
**Query Parameters**
* ``audit_period_beginning_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_beginning_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``launched_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``launched_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_max``: datetime (yyyy-mm-dd hh:mm:ss)
**Example request**:
.. sourcecode:: http
GET /db/stats/nova/exists/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"stats":
[
{"status": "pending", "send_status": 0, "event_count": 1},
{"status": "verified", "send_status": 200, "event_count": 100},
{"status": "reconciled", "send_status": 200, "event_count": 2},
{"status": "failed", "send_status": 0, "event_count": 1},
]
}
db/stats/glance/exists/
=======================
.. http:get:: http://example.com/db/stats/glance/exists
Returns a list of status combinations and count of events with those status combinations.
Note: Only status combinations with >0 count will show up.
**Query Parameters**
* ``audit_period_beginning_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_beginning_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``created_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``created_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_max``: datetime (yyyy-mm-dd hh:mm:ss)
**Example request**:
.. sourcecode:: http
GET /db/stats/glance/exists/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"stats":
[
{"status": "verified", "send_status": 200, "event_count": 200},
{"status": "failed", "send_status": 0, "event_count": 2},
]
}
db/usage/launches/
==================
.. http:get:: http://example.com/db/usage/launches/
Deprecated, see: :ref:`dbapi-nova-launches`
.. _dbapi-nova-launches:
db/usage/nova/launches/
=======================
.. http:get:: http://example.com/db/usage/nova/launches/
Returns a list of instance launches matching provided query criteria.
**Query Parameters**
* ``launched_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``launched_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``instance``: uuid
* ``limit``: int, default: 50, max: 1000
* ``offset``: int, default: 0
**Example request**:
.. sourcecode:: http
GET /db/usage/nova/launches/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"launches":
[
{
"os_distro": "org.centos",
"os_version": "5.8",
"instance_flavor_id": "2",
"instance_type_id": "2",
"launched_at": "2014-01-17 15:35:44",
"instance": "72e4d8e8-9f63-47cb-a904-0193e5edac6e",
"os_architecture": "x64",
"request_id": "req-7a86ed49-e1f4-4403-b3ef-22636f7acb7d",
"rax_options": "0",
"id": 91899,
"tenant": "5853600"
},
{
"os_distro": "org.centos",
"os_version": "5.8",
"instance_flavor_id": "performance1-4",
"instance_type_id": "11",
"launched_at": "2014-01-17 15:35:20",
"instance": "932bcfd9-af68-4261-805e-6e43156c3b40",
"os_architecture": "x64",
"request_id": "req-6bfe911f-40f2-4fd8-946a-070c10bed014",
"rax_options": "0",
"id": 91898,
"tenant": "5853595"
}
]
}
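With a 1000-row cap per request, larger result sets are walked with ``limit``/``offset``. A sketch of that paging loop; ``fetch_page`` is a hypothetical stand-in for the HTTP GET, so the loop can be reused for any of the list endpoints:

```python
def iter_launches(fetch_page, limit=50):
    """Page through /db/usage/nova/launches/ with limit/offset.

    ``fetch_page(offset, limit)`` should return the decoded
    "launches" list for one page; a short page means we are done.
    """
    offset = 0
    while True:
        page = fetch_page(offset, limit)
        for launch in page:
            yield launch
        if len(page) < limit:
            return
        offset += limit


# Example with a fake fetcher over 120 records:
records = [{"id": n} for n in range(120)]
fake = lambda off, lim: records[off:off + lim]
assert len(list(iter_launches(fake, limit=50))) == 120
```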
db/usage/glance/images/
=======================
.. http:get:: http://example.com/db/usage/glance/images/
Returns a list of images matching provided query criteria.
**Query Parameters**
* ``created_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``created_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``limit``: int, default: 50, max: 1000
* ``offset``: int, default: 0
**Example request**:
.. sourcecode:: http
GET /db/usage/glance/images/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"images":
[
{
"uuid": "2048efd8-fdce-4123-bdbc-add3bfe64b83",
"created_at": "2014-01-17 02:28:08",
"owner": null,
"last_raw": 299977,
"id": 4837,
"size": 9192352
},
{
"uuid": "aa2c07dd-fd1c-4ad3-9f73-6a6d7d8a0dbd",
"created_at": "2014-01-17 02:24:18",
"owner": "5937488",
"last_raw": 299967,
"id": 4836,
"size": 9
}
]
}
db/usage/launches/<launch_id>/
==============================
.. http:get:: http://example.com/db/usage/launches/<launch_id>/
Deprecated, see: :ref:`dbapi-nova-launch`
.. _dbapi-nova-launch:
db/usage/nova/launches/<launch_id>/
===================================
.. http:get:: http://example.com/db/usage/nova/launches/<launch_id>/
Returns the single launch with id matching the provided id.
**Example request**:
.. sourcecode:: http
GET /db/usage/nova/launches/91898/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"launch":
{
"os_distro": "org.centos",
"os_version": "5.8",
"instance_flavor_id": "performance1-4",
"instance_type_id": "11",
"launched_at": "2014-01-17 15:35:20",
"instance": "932bcfd9-af68-4261-805e-6e43156c3b40",
"os_architecture": "x64",
"request_id": "req-6bfe911f-40f2-4fd8-946a-070c10bed014",
"rax_options": "0",
"id": 91898,
"tenant": "5853595"
}
}
db/usage/glance/images/<image_id>/
==================================
.. http:get:: http://example.com/db/usage/glance/images/<image_id>/
Returns the single image with id matching the provided id.
**Example request**:
.. sourcecode:: http
GET /db/usage/glance/images/4836/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"launch":
{
"uuid": "aa2c07dd-fd1c-4ad3-9f73-6a6d7d8a0dbd",
"created_at": "2014-01-17 02:24:18",
"owner": "5937488",
"last_raw": 299967,
"id": 4836,
"size": 9
}
}
db/usage/deletes/
=================
.. http:get:: http://example.com/db/usage/deletes/
Deprecated, see: :ref:`dbapi-nova-deletes`
.. _dbapi-nova-deletes:
db/usage/nova/deletes/
======================
.. http:get:: http://example.com/db/usage/nova/deletes/
Returns a list of instance deletes matching provided query criteria.
**Query Parameters**
* ``launched_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``launched_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``instance``: uuid
* ``limit``: int, default: 50, max: 1000
* ``offset``: int, default: 0
**Example request**:
.. sourcecode:: http
GET /db/usage/nova/deletes/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"deletes":
[
{
"raw": 14615347,
"instance": "b36a8c2d-af88-4371-b14c-14dadf7073e5",
"deleted_at": "2014-01-17 16:07:30",
"id": 65110,
"launched_at": "2014-01-17 16:06:54"
},
{
"raw": 14615248,
"instance": "3fd6797d-bc35-42d9-ad85-157a2ea93023",
"deleted_at": "2014-01-17 16:05:23",
"id": 65108,
"launched_at": "2014-01-17 16:05:00"
}
]
}
db/usage/glance/deletes/
========================
.. http:get:: http://example.com/db/usage/glance/deletes/
Returns a list of image deletes matching provided query criteria.
**Query Parameters**
* ``deleted_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``limit``: int, default: 50, max: 1000
* ``offset``: int, default: 0
**Example request**:
.. sourcecode:: http
GET /db/usage/glance/deletes/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"deletes":
[
{
"raw": 300523,
"deleted_at": "2014-01-17 15:28:18.154927",
"id": 3169,
"uuid": "f8b02f0e-b392-40f5-9d39-0458ae6ebfb3"
},
{
"raw": 300512,
"deleted_at": "2014-01-17 14:28:20.544617",
"id": 3168,
"uuid": "4c9dc0be-856b-4e98-81a5-1b63df108e7d"
}
]
}
db/usage/deletes/<delete_id>/
=============================
.. http:get:: http://example.com/db/usage/deletes/<delete_id>/
Deprecated, see: :ref:`dbapi-nova-delete`
.. _dbapi-nova-delete:
db/usage/nova/deletes/<delete_id>/
==================================
.. http:get:: http://example.com/db/usage/nova/deletes/<delete_id>
Returns the single instance delete with id matching the provided id.
**Example request**:
.. sourcecode:: http
GET /db/usage/nova/deletes/65110/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"delete":
{
"raw": 14615347,
"instance": "b36a8c2d-af88-4371-b14c-14dadf7073e5",
"deleted_at": "2014-01-17 16:07:30",
"id": 65110,
"launched_at": "2014-01-17 16:06:54"
}
}
db/usage/glance/deletes/<delete_id>/
====================================
.. http:get:: http://example.com/db/usage/glance/deletes/<delete_id>
Returns the single image delete with id matching the provided id.
**Example request**:
.. sourcecode:: http
GET /db/usage/glance/deletes/3168/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"delete":
{
"raw": 300512,
"deleted_at": "2014-01-17 14:28:20.544617",
"id": 3168,
"uuid": "4c9dc0be-856b-4e98-81a5-1b63df108e7d"
}
}
db/usage/exists/
================
.. http:get:: http://example.com/db/usage/exists/
Deprecated, see: :ref:`dbapi-nova-exists`
.. _dbapi-nova-exists:
db/usage/nova/exists/
=====================
.. http:get:: http://example.com/db/usage/nova/exists/
Returns a list of instance exists matching provided query criteria.
**Query Parameters**
* ``audit_period_beginning_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_beginning_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``launched_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``launched_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``instance``: uuid
* ``limit``: int, default: 50, max: 1000
* ``offset``: int, default: 0
**Example request**:
.. sourcecode:: http
GET /db/usage/nova/exists/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"exists":
[
{
"status": "verified",
"os_distro": "org.centos",
"bandwidth_public_out": 0,
"received": "2014-01-17 16:16:43.695474",
"instance_type_id": "2",
"raw": 14615544,
"os_architecture": "x64",
"rax_options": "0",
"audit_period_ending": "2014-01-17 16:16:43",
"deleted_at": null,
"id": 135106,
"tenant": "5889124",
"audit_period_beginning": "2014-01-17 00:00:00",
"fail_reason": null,
"instance": "978b32ea-374b-48c6-814b-bb6151e2fb5c",
"instance_flavor_id": "2",
"launched_at": "2014-01-17 16:16:09",
"os_version": "6.0",
"usage": 91932,
"send_status": 201,
"message_id": "9d28fa15-d163-40c7-8195-2853ad13179b",
"delete": null
},
{
"status": "verified",
"os_distro": "org.centos",
"bandwidth_public_out": 0,
"received": "2014-01-17 16:10:42.112505",
"instance_type_id": "2",
"raw": 14615459,
"os_architecture": "x64",
"rax_options": "0",
"audit_period_ending": "2014-01-17 16:10:42",
"deleted_at": null,
"id": 135105,
"tenant": "5824940",
"audit_period_beginning": "2014-01-17 00:00:00",
"fail_reason": null,
"instance": "860b5df0-d58b-498d-8838-7156d701732c",
"instance_flavor_id": "2",
"launched_at": "2014-01-17 16:10:08",
"os_version": "5.9",
"usage": 91937,
"send_status": 201,
"message_id": "0a6b1c58-8443-4788-ac08-05cd03e6be53",
"delete": null
}
]
}
db/usage/glance/exists/
=======================
.. http:get:: http://example.com/db/usage/glance/exists/
Returns a list of image exists matching provided query criteria.
**Query Parameters**
* ``audit_period_beginning_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_beginning_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``audit_period_ending_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``created_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``created_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``deleted_at_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_min``: datetime (yyyy-mm-dd hh:mm:ss)
* ``received_max``: datetime (yyyy-mm-dd hh:mm:ss)
* ``limit``: int, default: 50, max: 1000
* ``offset``: int, default: 0
**Example request**:
.. sourcecode:: http
GET /db/usage/glance/exists/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"exists":
[
{
"status": "verified",
"audit_period_beginning": "2014-01-13 00:00:00",
"fail_reason": null,
"uuid": "d39a04bd-6ba0-4d20-8591-937ab43897dc",
"usage": 2553,
"created_at": "2013-05-11 15:37:34",
"size": 11213393920,
"owner": "389886",
"message_id": "9c5fd5af-60b4-45ad-b524-c4a9964f31e4",
"raw": 283303,
"audit_period_ending": "2014-01-13 23:59:59",
"received": "2014-01-13 09:20:02.777965",
"deleted_at": null,
"send_status": 0,
"id": 5301,
"delete": null
},
{
"status": "verified",
"audit_period_beginning": "2014-01-13 00:00:00",
"fail_reason": null,
"uuid": "6713c136-0555-4a93-b726-edb181d4b69e",
"usage": 1254,
"created_at": "2013-05-11 15:37:56",
"size": 11254732800,
"owner": "389886",
"message_id": "9c5fd5af-60b4-45ad-b524-c4a9964f31e4",
"raw": 283303,
"audit_period_ending": "2014-01-13 23:59:59",
"received": "2014-01-13 09:20:02.777965",
"deleted_at": null,
"send_status": 0,
"id": 5300,
"delete": null
}
]
}
db/usage/exists/<exist_id>/
===========================
.. http:get:: http://example.com/db/usage/exists/<exist_id>
Deprecated, see: :ref:`dbapi-nova-exist`
.. _dbapi-nova-exist:
db/usage/nova/exists/<exist_id>/
================================
.. http:get:: http://example.com/db/usage/nova/exists/<exist_id>
Returns the single instance exists with id matching the provided id.
**Example request**:
.. sourcecode:: http
GET /db/usage/nova/exists/135105/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"exist":
{
"status": "verified",
"os_distro": "org.centos",
"bandwidth_public_out": 0,
"received": "2014-01-17 16:10:42.112505",
"instance_type_id": "2",
"raw": 14615459,
"os_architecture": "x64",
"rax_options": "0",
"audit_period_ending": "2014-01-17 16:10:42",
"deleted_at": null,
"id": 135105,
"tenant": "5824940",
"audit_period_beginning": "2014-01-17 00:00:00",
"fail_reason": null,
"instance": "860b5df0-d58b-498d-8838-7156d701732c",
"instance_flavor_id": "2",
"launched_at": "2014-01-17 16:10:08",
"os_version": "5.9",
"usage": 91937,
"send_status": 201,
"message_id": "0a6b1c58-8443-4788-ac08-05cd03e6be53",
"delete": null
}
}
db/usage/glance/exists/<exist_id>/
==================================
.. http:get:: http://example.com/db/usage/glance/exists/<exist_id>/
Returns the single image exists with id matching the provided id.
**Example request**:
.. sourcecode:: http
GET /db/usage/glance/exists/5300/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: application/json
{
"exist":
{
"status": "verified",
"audit_period_beginning": "2014-01-13 00:00:00",
"fail_reason": null,
"uuid": "6713c136-0555-4a93-b726-edb181d4b69e",
"usage": 1254,
"created_at": "2013-05-11 15:37:56",
"size": 11254732800,
"owner": "389886",
"message_id": "9c5fd5af-60b4-45ad-b524-c4a9964f31e4",
"raw": 283303,
"audit_period_ending": "2014-01-13 23:59:59",
"received": "2014-01-13 09:20:02.777965",
"deleted_at": null,
"send_status": 0,
"id": 5300,
"delete": null
}
}
/db/repair
==========
.. http:post:: http://example.com/db/repair/
Changes the status of all exists records whose message_ids are sent with the
request from 'pending' to 'sent_unverified', so that the verifier does not
end up sending .verified notifications for exists that have already been
marked .verified and sent to AH by Yagi. The message_ids of exists which
could not be updated are sent back in the JSON response.
**Example request**:
.. sourcecode:: http
POST /db/repair/ HTTP/1.1
Host: example.com
Accept: application/json
**Example response**:
.. sourcecode:: http
HTTP/1.1 200 OK
Vary: Accept
Content-Type: text/json
{
    "exists_not_pending": ["494ebfce-0219-4b62-b810-79039a279620"],
    "absent_exists": ["7609f3b2-3694-4b6f-869e-2f13ae504cb2",
                      "0c64032e-4a60-44c0-a99d-5a4f2e46afb0"]
}
:query message_ids: list of message_ids of exists messages
:query service: ``nova`` or ``glance``. default="nova"
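The two lists in the repair response can drive a retry or alerting loop. A minimal sketch, assuming the decoded response dict; the ``partition_repair`` helper is illustrative, not part of StackTach:

```python
def partition_repair(message_ids, response):
    """Split the submitted message_ids by repair outcome.

    exists_not_pending: found, but not in 'pending', so not updated.
    absent_exists: no exists row matched the message_id at all.
    Everything else was flipped to 'sent_unverified'.
    """
    not_pending = set(response.get("exists_not_pending", []))
    absent = set(response.get("absent_exists", []))
    repaired = [m for m in message_ids
                if m not in not_pending and m not in absent]
    return repaired, sorted(not_pending), sorted(absent)


resp = {"exists_not_pending": ["a"], "absent_exists": ["b", "c"]}
repaired, not_pending, absent = partition_repair(["a", "b", "c", "d"], resp)
print(repaired)  # ['d']
```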

BIN
docs/images/diagram.gif Normal file

Binary file not shown.

Size: 33 KiB

26
docs/index.rst Normal file

@ -0,0 +1,26 @@
.. StackTach documentation master file, created by
sphinx-quickstart on Tue Jan 14 14:34:29 2014.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to StackTach's documentation!
=====================================
Contents:
.. toctree::
:maxdepth: 3
intro
setup
api
usage
dbapi
Indices and tables
==================
* :ref:`genindex`
* :ref:`search`

39
docs/intro.rst Normal file

@ -0,0 +1,39 @@
An Introduction to StackTach
============================
StackTach was initially created as a browser-based debugging tool
for OpenStack Nova. Since that time, StackTach has evolved into a
tool that can perform debugging and performance monitoring, and can
audit, validate and reconcile Nova and Glance usage in a manner
suitable for billing.
How it works
************
Nearly all OpenStack components are capable of generating
*notifications* when significant events occur. Notifications
are messages placed on the OpenStack queue (generally RabbitMQ)
for consumption by downstream systems.
The OpenStack wiki has info on the `notification format`_.
.. _notification format: http://wiki.openstack.org/SystemUsageData
StackTach has a *worker* that is configured to read these notifications
and store them in a database (ideally a database separate from the
OpenStack production database). From there, StackTach reviews the stream
of notifications to glean usage information and assemble it in an
easy-to-query fashion.
Users can query instances, requests, servers, etc. using the
browser interface or the command line tool (`Stacky`_).
.. _Stacky: https://github.com/rackerlabs/stacky
.. image:: images/diagram.gif
To get a general sense of notification adoption across OpenStack projects `read this blog post`_
.. _read this blog post: http://www.sandywalsh.com/2013/09/notification-usage-in-openstack-report.html

116
docs/setup.rst Normal file

@ -0,0 +1,116 @@
Installing StackTach
####################
The "Hurry Up" Install Guide
****************************
#. Create a database for StackTach to use. By default, StackTach assumes MySql, but you can modify the settings.py file to others.
#. Install django and the other required libraries listed in ``./etc/pip-requires.txt`` (please let us know if any are missing)
#. Clone this repo
#. Copy and configure the config files in ``./etc`` (see below for details)
#. Create the necessary database tables (``python manage.py syncdb``). You don't need an administrator account since no user profiles are used.
#. Configure OpenStack to publish Notifications back into RabbitMQ (see below)
#. Restart the OpenStack services.
#. Run the Worker to start consuming messages. (see below)
#. Run the web server (``python manage.py runserver --insecure``)
#. Point your browser to ``http://127.0.0.1:8000`` (the default server location)
#. Click on stuff, see what happens. You can't hurt anything, it's all read-only.
Of course, this is only suitable for playing around. If you want to get serious about deploying StackTach, you should set up a proper web server and database on standalone servers. StackTach collects a lot of data (depending on your deployment size), so be warned and keep an eye on database size.
.. _stacktach-config-files:
The Config Files
****************
There are two config files for StackTach. The first one tells us where the second one is. A sample of these two files is in ``./etc/sample_*``. Create a local copy of these files and populate them with the appropriate config values as described below.
The ``sample_stacktach_config.sh`` shell script defines the necessary environment variables StackTach needs. Most of these are just information about the database (assuming MySql) but some are a little different. Copy this file and modify it for your environment. ``source`` this
``stacktach_config.sh`` shell script to set up the necessary environment variables.
``STACKTACH_INSTALL_DIR`` should point to the directory StackTach runs from. In most cases this will be your repo directory, but it could be elsewhere if you're going for a proper deployment.
The StackTach worker needs to know which RabbitMQ servers to listen to. This information is stored in the deployment file. ``STACKTACH_DEPLOYMENTS_FILE`` should point to this json file. To learn more about the deployments file, see further down.
Finally, ``DJANGO_SETTINGS_MODULE`` tells Django where to get its configuration from. This should point to the ``settings.py`` file. You shouldn't have to do much with the ``settings.py`` file; most of what it needs comes from these environment variables.
The ``sample_stacktach_worker_config.json`` file tells StackTach where each of the RabbitMQ servers are that it needs to get events from. In most cases you'll only have one entry in this file, but for large multi-cell deployments, this file can get pretty large. It's also handy for setting up one StackTach for each developer environment.
The file is in json format and the main configuration is under the ``deployments`` key, which should contain a list of deployment dictionaries.
A blank worker config file would look like this: ::
{"deployments": [] }
But that's not much fun. A deployment entry would look like this: ::
{"deployments": [
{
"name": "east_coast.prod.cell1",
"durable_queue": false,
"rabbit_host": "10.0.1.1",
"rabbit_port": 5672,
"rabbit_userid": "rabbit",
"rabbit_password": "rabbit",
"rabbit_virtual_host": "/",
"topics": {
"nova": [
{"queue": "notifications.info", "routing_key": "notifications.info"},
{"queue": "notifications.error", "routing_key": "notifications.error"},
]
}
}
]}
where *name* is whatever you want to call your deployment, and *rabbit_\** are the connectivity details for your rabbit server. This should be the same information OpenStack is using in your ``nova.conf`` file. Note: json has no concept of comments, so using ``#``, ``//`` or ``/* */`` as a comment won't work.
By default, Nova uses ephemeral queues. If you are using durable queues, be sure to change the necessary flag here.
The topics section defines which queues to pull notifications from. You should
pull notifications from all related queues (``.error``, ``.info``, ``.warn``, etc)
You can add as many deployments as you like.
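Because ``json.loads`` rejects trailing commas and comments outright, a quick sanity check of the deployments file catches the mistakes warned about above before the workers start. A hedged sketch; the ``check_deployments`` helper and its required-key list are mine (drawn from the sample entry), not StackTach code:

```python
import json

# Keys shown in the sample deployment entry above.
REQUIRED = ("name", "durable_queue", "rabbit_host", "rabbit_port",
            "rabbit_userid", "rabbit_password", "rabbit_virtual_host",
            "topics")


def check_deployments(text):
    """Parse a worker config and report missing deployment keys.

    json.loads will raise ValueError on trailing commas or comments,
    which is exactly the failure mode the prose warns about.
    """
    config = json.loads(text)
    problems = []
    for i, dep in enumerate(config.get("deployments", [])):
        missing = [k for k in REQUIRED if k not in dep]
        if missing:
            problems.append((i, missing))
    return problems
```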
Starting the Worker
===================
Note: the worker now uses ``librabbitmq``; be sure to install it first.
``./worker/start_workers.py`` will spawn a worker.py process for each deployment defined. Each worker will consume from a single Rabbit queue.
Configuring Nova to Generate Notifications
==========================================
In the OpenStack service you wish to have generate notifications, add the
following to its ``.conf`` file: ::
--notification_driver=nova.openstack.common.notifier.rpc_notifier
--notification_topics=monitor
**Note:** This will likely change once the various projects switch to ``oslo.messaging``,
which uses endpoints to define the notification drivers.
This will tell OpenStack to publish notifications to a Rabbit exchange starting with
``monitor.*`` ... this may result in ``monitor.info``, ``monitor.error``, etc.
You'll need to restart Nova once these changes are made.
If you're using `DevStack`_ you may want to set up your ``local.conf`` to include the following: ::
[[post-config|$NOVA_CONF]]
[DEFAULT]
notification_driver=nova.openstack.common.notifier.rpc_notifier
notification_topics=notifications,monitor
notify_on_state_change=vm_and_task_state
notify_on_any_change=True
instance_usage_audit=True
instance_usage_audit_period=hour
.. _DevStack: http://devstack.org/
Next Steps
==========
Once you have this working well, you should download and install ``Stacky`` and play with the command line tool.

89
docs/usage.rst Normal file

@ -0,0 +1,89 @@
StackTach Usage Verification
############################
Usage Basics
************
In OpenStack, usage is tracked through notifications. Notifications are emitted by each service as users request changes and as each service performs those changes. Services like Nova can also be configured to emit periodic audit notifications exposing the state of the database at the time of the audit. The periodic audit notifications are useful for billing, as it is not necessary to store past states.
But, we want to be sure what we're billing for is correct and that we've received audit notifications for every instance that should be billable. Thus, it is a good idea to track instance state so that periodic audit notifications can be validated against that state. The notifications each service sends as changes are requested and performed are extremely useful for tracking instance state through different billable states.
The idea behind StackTach's Usage Verification is to track changes through instantaneous notifications, then compare them to the periodic audit notifications for correctness. After a notification has been validated, StackTach itself will emit a copy of it with a new event_type indicating that it has been verified. StackTach also provides a set of scripts which can be used to confirm that exists notifications were sent for all instances in a billable state.
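The verification flow described above can be sketched as follows. This is an illustrative toy, not StackTach's actual code; the real verifier compares many fields and emits the configured event_type (e.g. ``compute.instance.exists.verified.old``):

```python
# Toy sketch of usage verification (not StackTach's implementation):
# compare a periodic .exists audit against instance state tracked from
# instantaneous notifications, then re-emit it with a "verified" marker.
tracked_state = {'instance-1': {'launched_at': '2014-03-04 00:00:00'}}

def verify_exists(notification):
    payload = notification['payload']
    tracked = tracked_state.get(payload['instance_id'])
    if tracked and tracked['launched_at'] == payload['launched_at']:
        verified = dict(notification)  # copy with a new event_type
        verified['event_type'] = notification['event_type'] + '.verified'
        return verified
    return None  # verification failed; the entry would be marked 'failed'

result = verify_exists({'event_type': 'compute.instance.exists',
                        'payload': {'instance_id': 'instance-1',
                                    'launched_at': '2014-03-04 00:00:00'}})
```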
Configuring Usage Verification
******************************
Usage Verification in StackTach is done by a separate verifier process. A sample configuration file can be found at ``./etc/sample_stacktach_verifier_config.json``
The default config provides all of the settings required by the verifier. ::
{
"tick_time": 30,
"settle_time": 5,
"settle_units": "minutes",
"pool_size": 2,
"enable_notifications": true,
"validation_level": "all",
"flavor_field_name": "instance_type_id",
"rabbit": {
"durable_queue": false,
"host": "10.0.0.1",
"port": 5672,
"userid": "rabbit",
"password": "rabbit",
"virtual_host": "/",
"topics": {
"nova": ["notifications.info"],
"glance": ["notifications.info"]
}
}
}
* tick_time: Time in seconds to sleep before attempting to retrieve pending usage entries for verification
* settle_time: Amount of time to wait between when a usage notification was emitted and when it is picked up for verification.
* settle_units: Units for the settle_time value
* pool_size: Number of verifier processes to create for the verifier pool.
* enable_notifications: Whether or not to emit verified notifications.
* validation_level: Determines how strict datatype validation will be on usage notifications. Values are ``none``, ``basic``, and ``all``.
* flavor_field_name: Field to use for flavor verification. Values are ``instance_type_id`` and ``instance_flavor_id``.
* rabbit: Rabbit config, please see :ref:`StackTach install guide <stacktach-config-files>` for rabbit config details.
* The topics determine which services the verifier will verify. With the sample config above, the Nova and Glance services will be verified, and verified notifications will be emitted with a routing_key of notifications.info.
* An alternate config that would only verify Nova and emit verified notifications on notifications.info and monitor.info: ::
"topics": {
"nova": ["notifications.info", "monitor.info"]
}
* Other Config Options:
* nova_event_type: Event type to emit for Nova events
* Default: compute.instance.exists.verified.old
* glance_event_type: Event type to emit for Glance events
* Default: image.exists.verified.old
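The settle settings above can be read as a cutoff: a usage entry is only picked up once it is at least ``settle_time`` ``settle_units`` old, and the verifier checks for settled entries every ``tick_time`` seconds. A minimal sketch of that cutoff calculation (an assumption about the semantics, not the verifier's actual code):

```python
import datetime

def settle_cutoff(settle_time, settle_units, now=None):
    # Entries emitted after this cutoff are considered unsettled and
    # are skipped until a later tick. Sketch only, not StackTach code.
    now = now or datetime.datetime.utcnow()
    delta = datetime.timedelta(**{settle_units: settle_time})
    return now - delta

# With the sample config (settle_time=5, settle_units="minutes"),
# only entries emitted more than five minutes ago are verified.
cutoff = settle_cutoff(5, 'minutes')
```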
Starting the Verifier
*********************
``./verifier/start_verifier.py`` will spawn a verifier.py process for each service being verified along with a pool of processes to verify each usage entry.
Audit Reports
*************
StackTach also provides a few reports for auditing the periodic audit notifications, which can be useful for confirming that all usage notifications were sent for a deployment.
* ``./reports/nova_usage_audit.py``
* Suggested Arguments:
* --period_length: ``day`` or ``year``, default: ``day``
* --utcdatetime: Overrides datetime used to audit, default: current utc datetime
* --store: ``True`` or ``False``, whether or not to store report in StackTach database
* ``./reports/glance_usage_audit.py``
* Suggested Arguments:
* --period_length: ``day`` or ``year``, default: ``day``
* --utcdatetime: Overrides datetime used to audit, default: current utc datetime
* --store: ``True`` or ``False``, whether or not to store report in StackTach database
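For a rough sense of what ``--period_length`` and ``--utcdatetime`` control, the audit period boundaries can be derived like this (a sketch of the intended semantics, not the reports' actual code):

```python
import datetime

def audit_period(utcdatetime, period_length='day'):
    # [start, end) boundaries the audit would cover; sketch only.
    if period_length == 'day':
        start = utcdatetime.replace(hour=0, minute=0, second=0,
                                    microsecond=0)
        end = start + datetime.timedelta(days=1)
    elif period_length == 'year':
        start = utcdatetime.replace(month=1, day=1, hour=0, minute=0,
                                    second=0, microsecond=0)
        end = start.replace(year=start.year + 1)
    else:
        raise ValueError('period_length must be "day" or "year"')
    return start, end
```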

@ -1,4 +1,4 @@
Django>=1.4.2
Django>=1.4.2, <1.6.0
MySQL-python>=1.2.3
eventlet>=0.9.17
kombu>=2.4.7
@ -7,4 +7,6 @@ prettytable>=0.7.2
argparse
Pympler
requests
south
south
sphinxcontrib-httpdomain
pbr

@ -4,7 +4,8 @@
"settle_units": "minutes",
"pool_size": 2,
"enable_notifications": true,
"validation_level": "all",
"validation_level": "all",
"flavor_field_name": "instance_type_id",
"rabbit": {
"durable_queue": false,
"host": "10.0.0.1",

@ -176,7 +176,7 @@ def store_results(start, end, summary, details):
'created': dt.dt_to_decimal(datetime.datetime.utcnow()),
'period_start': start,
'period_end': end,
'version': 4,
'version': 6,
'name': 'glance usage audit'
}

@ -224,7 +224,7 @@ def store_results(start, end, summary, details):
'created': dt.dt_to_decimal(datetime.datetime.utcnow()),
'period_start': start,
'period_end': end,
'version': 5,
'version': 6,
'name': 'nova usage audit'
}

@ -9,8 +9,11 @@ def _status_queries(exists_query):
fail = exists_query.filter(status=models.InstanceExists.FAILED)
pending = exists_query.filter(status=models.InstanceExists.PENDING)
verifying = exists_query.filter(status=models.InstanceExists.VERIFYING)
return verified, reconciled, fail, pending, verifying
sent_unverified = exists_query.filter(status=models.InstanceExists.SENT_UNVERIFIED)
sent_failed = exists_query.filter(status=models.InstanceExists.SENT_FAILED)
sent_verifying = exists_query.filter(status=models.InstanceExists.SENT_VERIFYING)
return verified, reconciled, fail, pending, verifying, sent_unverified, \
sent_failed, sent_verifying
def _send_status_queries(exists_query):
@ -28,7 +31,8 @@ def _send_status_queries(exists_query):
def _audit_for_exists(exists_query):
(verified, reconciled,
fail, pending, verifying) = _status_queries(exists_query)
fail, pending, verifying, sent_unverified,
sent_failed, sent_verifying) = _status_queries(exists_query)
(success, unsent, redirect,
client_error, server_error) = _send_status_queries(verified)
@ -43,6 +47,9 @@ def _audit_for_exists(exists_query):
'failed': fail.count(),
'pending': pending.count(),
'verifying': verifying.count(),
'sent_unverified': sent_unverified.count(),
'sent_failed': sent_failed.count(),
'sent_verifying': sent_verifying.count(),
'send_status': {
'success': success.count(),
'unsent': unsent.count(),

@ -0,0 +1,10 @@
{
"host": "devstack.example.com",
"port": 5672,
"userid": "guest",
"password": "password",
"durable_queue": false,
"exchange": "nova",
"virtual_host": "/",
"routing_key": "monitor.info"
}

@ -0,0 +1,75 @@
import argparse
import json
import os
import sys
import time
sys.path.append(os.environ.get('STACKTACH_INSTALL_DIR', '/stacktach'))
from stacktach import message_service as msg
from stacktach import utils
import scrubbers
def scrub(args, send_notif=lambda x: None):
print "Starting scrub."
start = utils.str_time_to_unix(args.start)
end = utils.str_time_to_unix(args.end)
if hasattr(scrubbers, args.scrubber):
Scrubber = getattr(scrubbers, args.scrubber)
scrubber = Scrubber(start, end)
count = 0
for raw in scrubber.raws():
matches, body = scrubber.filter(raw)
if matches and not body:
body = json.loads(raw['json'])[1]
if matches and body:
scrubbed = scrubber.scrub(body)
count += 1
send_notif(scrubbed)
return count
else:
print "No scrubber class %s." % args.scrubber
return 0
def scrub_with_notifications(args):
print "!!!!!! WARNING: SENDING TO RABBIT !!!!!!"
print "!!!!!! Sleeping for 30 seconds !!!!!!"
print "!!!!!! before proceeding !!!!!!"
time.sleep(30)
with open(args.rabbit_config) as fp:
rabbit_config = json.load(fp)
exchange = msg.create_exchange(rabbit_config['exchange'],
'topic',
durable=rabbit_config['durable_queue'])
conn_conf = (rabbit_config['host'], rabbit_config['port'],
rabbit_config['userid'], rabbit_config['password'],
'librabbitmq', rabbit_config['virtual_host'])
with msg.create_connection(*conn_conf) as conn:
def send_notif(notif):
msg.send_notification(notif, rabbit_config['routing_key'],
conn, exchange)
count = scrub(args, send_notif=send_notif)
return count
if __name__ == '__main__':
parser = argparse.ArgumentParser('Stacktach Notification Scrubber')
parser.add_argument('--rabbit', action='store_true')
parser.add_argument('--rabbit_config', default='rabbit_config.json')
parser.add_argument('--scrubber', required=True)
parser.add_argument('--start', required=True)
parser.add_argument('--end', required=True)
args = parser.parse_args()
if args.rabbit:
print "%s Events Scrubbed" % scrub_with_notifications(args)
else:
print "%s Events Scrubbed" % scrub(args)

70
scripts/scrubbers.py Normal file

@ -0,0 +1,70 @@
import json
import uuid
from django.db.models import F
from stacktach import models
class ScrubberBase(object):
def __init__(self, start, end):
self.start = start
self.end = end
def raws(self):
""" Returns an iterable of Raws to scrub
"""
return [].__iter__()
def filter(self, raw_data):
""" Returns whether or not the provided RawData needs to be scrubbed.
If the implementing function parses the json body to determine
if it needs to be scrubbed, it should be returned as the second
return value. This is done so that it will not need to be parsed
a second time for scrubbing. Negative matches need not return
parsed json bodies
@raw_data: a RawData dictionary
"""
return True, None
def scrub(self, body):
""" Returns the scrubbed json body of the RawData.
@body: Dictionary version of the RawData's json.
"""
return body
class ExistsCreatedAt(ScrubberBase):
def raws(self):
filters = {
'raw__when__gte': self.start,
'raw__when__lte': self.end,
'audit_period_ending__lt': F('audit_period_beginning') + (60*60*24)
}
exists = models.InstanceExists.objects.filter(**filters)
exists = exists.select_related('raw')
for exist in exists.iterator():
rawdata = exist.raw
yield {'json': rawdata.json}
def filter(self, raw_data):
if '+00:00' in raw_data['json']:
body = json.loads(raw_data['json'])[1]
created_at = body.get('payload', {}).get('created_at')
if created_at and '+00:00' in created_at:
return True, body
else:
return False, None
else:
return False, None
def scrub(self, body):
created_at = body['payload']['created_at']
scrubbed_created_at = created_at.replace('+00:00', '')
body['payload']['created_at'] = scrubbed_created_at
body['message_id'] = str(uuid.uuid4())
return body
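The net effect of ``ExistsCreatedAt.scrub`` above is a small string transform plus a fresh message_id, for example:

```python
import uuid

# Strip the spurious '+00:00' offset from created_at and stamp a new
# message_id so the re-emitted notification is treated as distinct.
body = {'payload': {'created_at': '2014-03-04 13:38:20+00:00'},
        'message_id': 'original-id'}
body['payload']['created_at'] = \
    body['payload']['created_at'].replace('+00:00', '')
body['message_id'] = str(uuid.uuid4())
```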

@ -167,3 +167,7 @@ LOGGING = {
},
}
}
# Force use of the pickle serializer as a workaround for django-1.6. See:
# https://docs.djangoproject.com/en/dev/releases/1.6/#default-session-serialization-switched-to-json
SESSION_SERIALIZER='django.contrib.sessions.serializers.PickleSerializer'

22
setup.cfg Normal file

@ -0,0 +1,22 @@
[metadata]
name = stacktach
author = Dark Secret Software Inc., Rackspace Hosting
author-email = admin@darksecretsoftware.com
summary = OpenStack Monitoring and Billing
description-file = README.md
license = Apache-2
classifier =
Development Status :: 2 - Pre-Alpha
Environment :: Console
Intended Audience :: Developers
Intended Audience :: Information Technology
License :: OSI Approved :: Apache Software License
Operating System :: OS Independent
Programming Language :: Python
Topic :: Software Development :: Libraries :: Python Modules
keywords =
setup
distutils
[files]
packages =
stacktach

8
setup.py Normal file

@ -0,0 +1,8 @@
#!/usr/bin/env python
from setuptools import setup
setup(
setup_requires=['pbr'],
pbr=True,
)

@ -21,8 +21,10 @@
import decimal
import functools
import json
from datetime import datetime
from django.db import transaction
from django.db.models import Count
from django.db.models import FieldDoesNotExist
from django.forms.models import model_to_dict
from django.http import HttpResponse
@ -38,6 +40,7 @@ from stacktach import utils
DEFAULT_LIMIT = 50
HARD_LIMIT = 1000
HARD_WHEN_RANGE_LIMIT = 7 * 24 * 60 * 60 # 7 Days
class APIException(Exception):
@ -79,13 +82,6 @@ def _log_api_exception(cls, ex, request):
stacklog.error(msg)
def _exists_model_factory(service):
if service == 'glance':
return models.ImageExists
elif service == 'nova':
return models.InstanceExists
def api_call(func):
@functools.wraps(func)
@ -108,28 +104,85 @@ def api_call(func):
return handled
def _usage_model_factory(service):
if service == 'nova':
return {'klass': models.InstanceUsage, 'order_by': 'launched_at'}
if service == 'glance':
return {'klass': models.ImageUsage, 'order_by': 'created_at'}
def _exists_model_factory(service):
if service == 'nova':
return {'klass': models.InstanceExists, 'order_by': 'id'}
if service == 'glance':
return {'klass': models.ImageExists, 'order_by': 'id'}
def _deletes_model_factory(service):
if service == 'nova':
return {'klass': models.InstanceDeletes, 'order_by': 'launched_at'}
if service == 'glance':
return {'klass': models.ImageDeletes, 'order_by': 'deleted_at'}
@api_call
def list_usage_launches(request):
objects = get_db_objects(models.InstanceUsage, request, 'launched_at')
dicts = _convert_model_list(objects)
return {'launches': dicts}
return {'launches': list_usage_launches_with_service(request, 'nova')}
@api_call
def list_usage_images(request):
return { 'images': list_usage_launches_with_service(request, 'glance')}
def list_usage_launches_with_service(request, service):
model = _usage_model_factory(service)
objects = get_db_objects(model['klass'], request,
model['order_by'])
dicts = _convert_model_list(objects)
return dicts
def get_usage_launch_with_service(launch_id, service):
model = _usage_model_factory(service)
return {'launch': _get_model_by_id(model['klass'], launch_id)}
@api_call
def get_usage_launch(request, launch_id):
return {'launch': _get_model_by_id(models.InstanceUsage, launch_id)}
return get_usage_launch_with_service(launch_id, 'nova')
@api_call
def get_usage_image(request, image_id):
return get_usage_launch_with_service(image_id, 'glance')
@api_call
def list_usage_deletes(request):
objects = get_db_objects(models.InstanceDeletes, request, 'launched_at')
return list_usage_deletes_with_service(request, 'nova')
@api_call
def list_usage_deletes_glance(request):
return list_usage_deletes_with_service(request, 'glance')
def list_usage_deletes_with_service(request, service):
model = _deletes_model_factory(service)
objects = get_db_objects(model['klass'], request,
model['order_by'])
dicts = _convert_model_list(objects)
return {'deletes': dicts}
@api_call
def get_usage_delete(request, delete_id):
return {'delete': _get_model_by_id(models.InstanceDeletes, delete_id)}
model = _deletes_model_factory('nova')
return {'delete': _get_model_by_id(model['klass'], delete_id)}
@api_call
def get_usage_delete_glance(request, delete_id):
model = _deletes_model_factory('glance')
return {'delete': _get_model_by_id(model['klass'], delete_id)}
def _exists_extra_values(exist):
@ -139,23 +192,18 @@ def _exists_extra_values(exist):
@api_call
def list_usage_exists(request):
try:
custom_filters = {}
if 'received_min' in request.GET:
received_min = request.GET['received_min']
custom_filters['received_min'] = {}
custom_filters['received_min']['raw__when__gte'] = \
utils.str_time_to_unix(received_min)
if 'received_max' in request.GET:
received_max = request.GET['received_max']
custom_filters['received_max'] = {}
custom_filters['received_max']['raw__when__lte'] = \
utils.str_time_to_unix(received_max)
except AttributeError:
msg = "Range filters must be dates."
raise BadRequestException(message=msg)
return list_usage_exists_with_service(request, 'nova')
objects = get_db_objects(models.InstanceExists, request, 'id',
@api_call
def list_usage_exists_glance(request):
return list_usage_exists_with_service(request, 'glance')
def list_usage_exists_with_service(request, service):
model = _exists_model_factory(service)
custom_filters = _get_exists_filter_args(request)
objects = get_db_objects(model['klass'], request, 'id',
custom_filters=custom_filters)
dicts = _convert_model_list(objects, _exists_extra_values)
return {'exists': dicts}
@ -166,6 +214,33 @@ def get_usage_exist(request, exist_id):
return {'exist': _get_model_by_id(models.InstanceExists, exist_id,
_exists_extra_values)}
@api_call
def get_usage_exist_glance(request, exist_id):
return {'exist': _get_model_by_id(models.ImageExists, exist_id,
_exists_extra_values)}
@api_call
def get_usage_exist_stats(request):
return {'stats': _get_exist_stats(request, 'nova')}
@api_call
def get_usage_exist_stats_glance(request):
return {'stats': _get_exist_stats(request, 'glance')}
def _get_exist_stats(request, service):
klass = _exists_model_factory(service)['klass']
exists_filters = _get_exists_filter_args(request)
filters = _get_filter_args(klass, request,
custom_filters=exists_filters)
for value in exists_filters.values():
filters.update(value)
query = klass.objects.filter(**filters)
values = query.values('status', 'send_status')
stats = values.annotate(event_count=Count('send_status'))
return list(stats)
@api_call
def exists_send_status(request, message_id):
@ -210,7 +285,7 @@ def _find_exists_with_message_id(msg_id, exists_model, service):
def _ping_processing_with_service(pings, service):
exists_model = _exists_model_factory(service)
exists_model = _exists_model_factory(service)['klass']
with transaction.commit_on_success():
for msg_id, status_code in pings.items():
try:
@ -265,6 +340,25 @@ def _check_has_field(klass, field_name):
raise BadRequestException(msg)
def _get_exists_filter_args(request):
try:
custom_filters = {}
if 'received_min' in request.GET:
received_min = request.GET['received_min']
custom_filters['received_min'] = {}
custom_filters['received_min']['raw__when__gte'] = \
utils.str_time_to_unix(received_min)
if 'received_max' in request.GET:
received_max = request.GET['received_max']
custom_filters['received_max'] = {}
custom_filters['received_max']['raw__when__lte'] = \
utils.str_time_to_unix(received_max)
except AttributeError:
msg = "Range filters must be dates."
raise BadRequestException(message=msg)
return custom_filters
def _get_filter_args(klass, request, custom_filters=None):
filter_args = {}
if 'instance' in request.GET:
@ -353,3 +447,69 @@ def _convert_model_list(model_list, extra_values_func=None):
converted.append(_convert_model(item, extra_values_func))
return converted
def _rawdata_factory(service):
if service == "nova":
rawdata = models.RawData.objects
elif service == "glance":
rawdata = models.GlanceRawData.objects
else:
raise BadRequestException(message="Invalid service")
return rawdata
@api_call
def get_event_stats(request):
try:
filters = {}
if 'when_min' in request.GET or 'when_max' in request.GET:
if not ('when_min' in request.GET and 'when_max' in request.GET):
msg = "When providing date range filters, " \
"a min and max are required."
raise BadRequestException(message=msg)
when_min = utils.str_time_to_unix(request.GET['when_min'])
when_max = utils.str_time_to_unix(request.GET['when_max'])
if when_max - when_min > HARD_WHEN_RANGE_LIMIT:
msg = "Date ranges may be no larger than %s seconds"
raise BadRequestException(message=msg % HARD_WHEN_RANGE_LIMIT)
filters['when__lte'] = when_max
filters['when__gte'] = when_min
service = request.GET.get("service", "nova")
rawdata = _rawdata_factory(service)
if filters:
rawdata = rawdata.filter(**filters)
events = rawdata.values('event').annotate(event_count=Count('event'))
events = list(events)
if 'event' in request.GET:
event = request.GET['event']
default = {'event': event, 'event_count': 0}
events = [x for x in events if x['event'] == event] or [default, ]
return {'stats': events}
except (KeyError, TypeError):
raise BadRequestException(message="Invalid/absent query parameter")
except (ValueError, AttributeError):
raise BadRequestException(message="Invalid format for date (Correct "
"format should be %Y-%m-%d %H:%M:%S)")
def repair_stacktach_down(request):
post_dict = dict((request.POST._iterlists()))
message_ids = post_dict.get('message_ids')
service = post_dict.get('service', ['nova'])
klass = _exists_model_factory(service[0])['klass']
absent_exists, exists_not_pending = \
klass.mark_exists_as_sent_unverified(message_ids)
response_data = {'absent_exists': absent_exists,
'exists_not_pending': exists_not_pending}
response = HttpResponse(json.dumps(response_data),
content_type="application/json")
return response

@ -256,12 +256,18 @@ class InstanceExists(models.Model):
VERIFIED = 'verified'
RECONCILED = 'reconciled'
FAILED = 'failed'
SENT_UNVERIFIED = 'sent_unverified'
SENT_FAILED = 'sent_failed'
SENT_VERIFYING = 'sent_verifying'
STATUS_CHOICES = [
(PENDING, 'Pending Verification'),
(VERIFYING, 'Currently Being Verified'),
(VERIFIED, 'Passed Verification'),
(RECONCILED, 'Passed Verification After Reconciliation'),
(FAILED, 'Failed Verification'),
(SENT_UNVERIFIED, 'Unverified but sent by Yagi'),
(SENT_FAILED, 'Failed Verification but sent by Yagi'),
(SENT_VERIFYING, 'Currently being verified but sent by Yagi')
]
instance = models.CharField(max_length=50, null=True,
@ -321,7 +327,10 @@ class InstanceExists(models.Model):
self.save()
def mark_failed(self, reason=None):
self.status = InstanceExists.FAILED
if self.status == InstanceExists.SENT_VERIFYING:
self.status = InstanceExists.SENT_FAILED
else:
self.status = InstanceExists.FAILED
if reason:
self.fail_reason = reason
self.save()
@ -329,6 +338,24 @@ class InstanceExists(models.Model):
def update_status(self, new_status):
self.status = new_status
@staticmethod
def mark_exists_as_sent_unverified(message_ids):
absent_exists = []
exists_not_pending = []
for message_id in message_ids:
try:
exists = InstanceExists.objects.get(message_id=message_id)
if exists.status == InstanceExists.PENDING:
exists.status = InstanceExists.SENT_UNVERIFIED
exists.save()
else:
exists_not_pending.append(message_id)
except Exception:
absent_exists.append(message_id)
return absent_exists, exists_not_pending
class Timing(models.Model):
"""Each Timing record corresponds to a .start/.end event pair
@ -458,11 +485,17 @@ class ImageExists(models.Model):
VERIFYING = 'verifying'
VERIFIED = 'verified'
FAILED = 'failed'
SENT_UNVERIFIED = 'sent_unverified'
SENT_FAILED = 'sent_failed'
SENT_VERIFYING = 'sent_verifying'
STATUS_CHOICES = [
(PENDING, 'Pending Verification'),
(VERIFYING, 'Currently Being Verified'),
(VERIFIED, 'Passed Verification'),
(FAILED, 'Failed Verification'),
(SENT_UNVERIFIED, 'Unverified but sent by Yagi'),
(SENT_FAILED, 'Failed Verification but sent by Yagi'),
(SENT_VERIFYING, 'Currently being verified but sent by Yagi')
]
uuid = models.CharField(max_length=50, db_index=True, null=True)
@ -513,11 +546,32 @@ class ImageExists(models.Model):
self.save()
def mark_failed(self, reason=None):
self.status = InstanceExists.FAILED
if self.status == ImageExists.SENT_VERIFYING:
self.status = ImageExists.SENT_FAILED
else:
self.status = ImageExists.FAILED
if reason:
self.fail_reason = reason
self.save()
@staticmethod
def mark_exists_as_sent_unverified(message_ids):
absent_exists = []
exists_not_pending = []
for message_id in message_ids:
exists_list = ImageExists.objects.filter(message_id=message_id)
if exists_list:
for exists in exists_list:
if exists.status == ImageExists.PENDING:
exists.status = ImageExists.SENT_UNVERIFIED
exists.save()
else:
exists_not_pending.append(message_id)
else:
absent_exists.append(message_id)
return absent_exists, exists_not_pending
def get_model_fields(model):
return model._meta.fields

@ -1,3 +1,4 @@
from copy import deepcopy
import decimal
import datetime
import json
@ -9,7 +10,7 @@ from django.shortcuts import get_object_or_404
import datetime_to_decimal as dt
import models
import utils
from django.core.exceptions import ObjectDoesNotExist, FieldError
from django.core.exceptions import ObjectDoesNotExist, FieldError, ValidationError
SECS_PER_HOUR = 60 * 60
SECS_PER_DAY = SECS_PER_HOUR * 24
@ -17,6 +18,8 @@ SECS_PER_DAY = SECS_PER_HOUR * 24
DEFAULT_LIMIT = 50
HARD_LIMIT = 1000
UTC_FORMAT = '%Y-%m-%d %H:%M:%S'
def _get_limit(request):
limit = request.GET.get('limit', DEFAULT_LIMIT)
@ -619,3 +622,90 @@ def search(request):
except FieldError:
return error_response(400, 'Bad Request', "The requested field '%s' does not exist for the corresponding object.\n"
"Note: The field names of database are case-sensitive." % field)
class BadRequestException(Exception):
pass
def _parse_created(created):
try:
created_datetime = datetime.datetime.strptime(created, '%Y-%m-%d')
return dt.dt_to_decimal(created_datetime)
except ValueError:
raise BadRequestException(
"'%s' value has an invalid format. It must be in YYYY-MM-DD format."
% created)
def _parse_id(id):
try:
return int(id)
except ValueError:
raise BadRequestException(
"'%s' value has an invalid format. It must be in integer "
"format." % id)
def _parse_fields_and_create_query_filters(request_filters):
query_filters = {}
for field, value in request_filters.iteritems():
if field == 'created':
decimal_created = _parse_created(value)
query_filters['created__gt'] = decimal_created
query_filters['created__lt'] = decimal_created + SECS_PER_DAY
elif field == 'id':
id = _parse_id(value)
query_filters['id__exact'] = id
else:
query_filters[field + '__exact'] = value
return query_filters
def _check_if_fields_searchable(request_filters):
allowed_fields = ['id', 'name', 'created', 'period_start', 'period_end']
invalid_fields = [field for field in request_filters.keys()
if field not in allowed_fields]
if invalid_fields:
raise BadRequestException(
"The requested fields either do not exist for the corresponding "
"object or are not searchable: %s. Note: The field names of "
"database are case-sensitive." %
', '.join(sorted(invalid_fields)))
def _create_query_filters(request):
request_filters = deepcopy(request.GET)
request_filters.pop('limit', None)
request_filters.pop('offset', None)
_check_if_fields_searchable(request_filters)
return _parse_fields_and_create_query_filters(request_filters)
def do_jsonreports_search(request):
try:
model = models.JsonReport
filters = _create_query_filters(request)
reports = model_search(request, model.objects, filters,
order_by='-id')
results = [['Id', 'Start', 'End', 'Created', 'Name', 'Version']]
for report in reports:
results.append([report.id,
datetime.datetime.strftime(
report.period_start, UTC_FORMAT),
datetime.datetime.strftime(
report.period_end, UTC_FORMAT),
datetime.datetime.strftime(
dt.dt_from_decimal(report.created),
UTC_FORMAT),
report.name,
report.version])
except BadRequestException as be:
return error_response(400, 'Bad Request', be.message)
except ValidationError as ve:
return error_response(400, 'Bad Request', ve.messages[0])
return rsp(json.dumps(results))

@ -7,47 +7,8 @@ web_logger = stacklog.get_logger('stacktach-web')
web_logger_listener = stacklog.LogListener(web_logger)
web_logger_listener.start()
urlpatterns = patterns('',
web_urls = (
url(r'^$', 'stacktach.views.welcome', name='welcome'),
url(r'stacky/deployments/$', 'stacktach.stacky_server.do_deployments'),
url(r'stacky/events/$', 'stacktach.stacky_server.do_events'),
url(r'stacky/hosts/$', 'stacktach.stacky_server.do_hosts'),
url(r'stacky/uuid/$', 'stacktach.stacky_server.do_uuid'),
url(r'stacky/timings/$', 'stacktach.stacky_server.do_timings'),
url(r'stacky/timings/uuid/$', 'stacktach.stacky_server.do_timings_uuid'),
url(r'stacky/summary/$', 'stacktach.stacky_server.do_summary'),
url(r'stacky/request/$', 'stacktach.stacky_server.do_request'),
url(r'stacky/reports/$', 'stacktach.stacky_server.do_jsonreports'),
url(r'stacky/report/(?P<report_id>\d+)/$',
'stacktach.stacky_server.do_jsonreport'),
url(r'stacky/show/(?P<event_id>\d+)/$',
'stacktach.stacky_server.do_show'),
url(r'stacky/watch/(?P<deployment_id>\d+)/$',
'stacktach.stacky_server.do_watch'),
url(r'stacky/search/$', 'stacktach.stacky_server.search'),
url(r'stacky/kpi/$', 'stacktach.stacky_server.do_kpi'),
url(r'stacky/kpi/(?P<tenant_id>\w+)/$', 'stacktach.stacky_server.do_kpi'),
url(r'stacky/usage/launches/$',
'stacktach.stacky_server.do_list_usage_launches'),
url(r'stacky/usage/deletes/$',
'stacktach.stacky_server.do_list_usage_deletes'),
url(r'stacky/usage/exists/$',
'stacktach.stacky_server.do_list_usage_exists'),
url(r'db/usage/launches/$',
'stacktach.dbapi.list_usage_launches'),
url(r'db/usage/launches/(?P<launch_id>\d+)/$',
'stacktach.dbapi.get_usage_launch'),
url(r'db/usage/deletes/$',
'stacktach.dbapi.list_usage_deletes'),
url(r'db/usage/deletes/(?P<delete_id>\d+)/$',
'stacktach.dbapi.get_usage_delete'),
url(r'db/usage/exists/$', 'stacktach.dbapi.list_usage_exists'),
url(r'db/usage/exists/(?P<exist_id>\d+)/$',
'stacktach.dbapi.get_usage_exist'),
url(r'db/confirm/usage/exists/(?P<message_id>[\w\-]+)/$',
'stacktach.dbapi.exists_send_status'),
url(r'^(?P<deployment_id>\d+)/$', 'stacktach.views.home', name='home'),
url(r'^(?P<deployment_id>\d+)/details/(?P<column>\w+)/(?P<row_id>\d+)/$',
'stacktach.views.details', name='details'),
@ -60,3 +21,78 @@ urlpatterns = patterns('',
url(r'^(?P<deployment_id>\d+)/instance_status/$',
'stacktach.views.instance_status', name='instance_status'),
)
stacky_urls = (
url(r'^stacky/deployments/$', 'stacktach.stacky_server.do_deployments'),
url(r'^stacky/events/$', 'stacktach.stacky_server.do_events'),
url(r'^stacky/hosts/$', 'stacktach.stacky_server.do_hosts'),
url(r'^stacky/uuid/$', 'stacktach.stacky_server.do_uuid'),
url(r'^stacky/timings/$', 'stacktach.stacky_server.do_timings'),
url(r'^stacky/timings/uuid/$', 'stacktach.stacky_server.do_timings_uuid'),
url(r'^stacky/summary/$', 'stacktach.stacky_server.do_summary'),
url(r'^stacky/request/$', 'stacktach.stacky_server.do_request'),
url(r'^stacky/reports/search/$',
'stacktach.stacky_server.do_jsonreports_search'),
url(r'^stacky/reports/$', 'stacktach.stacky_server.do_jsonreports'),
url(r'^stacky/report/(?P<report_id>\d+)/$',
'stacktach.stacky_server.do_jsonreport'),
url(r'^stacky/show/(?P<event_id>\d+)/$',
'stacktach.stacky_server.do_show'),
url(r'^stacky/watch/(?P<deployment_id>\d+)/$',
'stacktach.stacky_server.do_watch'),
url(r'^stacky/search/$', 'stacktach.stacky_server.search'),
url(r'^stacky/kpi/$', 'stacktach.stacky_server.do_kpi'),
url(r'^stacky/kpi/(?P<tenant_id>\w+)/$', 'stacktach.stacky_server.do_kpi'),
url(r'^stacky/usage/launches/$',
'stacktach.stacky_server.do_list_usage_launches'),
url(r'^stacky/usage/deletes/$',
'stacktach.stacky_server.do_list_usage_deletes'),
url(r'^stacky/usage/exists/$',
'stacktach.stacky_server.do_list_usage_exists'),
)
dbapi_urls = (
url(r'^db/usage/launches/$',
'stacktach.dbapi.list_usage_launches'),
url(r'^db/usage/nova/launches/$',
'stacktach.dbapi.list_usage_launches'),
url(r'^db/usage/glance/images/$',
'stacktach.dbapi.list_usage_images'),
url(r'^db/usage/launches/(?P<launch_id>\d+)/$',
'stacktach.dbapi.get_usage_launch'),
url(r'^db/usage/nova/launches/(?P<launch_id>\d+)/$',
'stacktach.dbapi.get_usage_launch'),
url(r'^db/usage/glance/images/(?P<image_id>\d+)/$',
'stacktach.dbapi.get_usage_image'),
url(r'^db/usage/deletes/$',
'stacktach.dbapi.list_usage_deletes'),
url(r'^db/usage/nova/deletes/$',
'stacktach.dbapi.list_usage_deletes'),
url(r'^db/usage/glance/deletes/$',
'stacktach.dbapi.list_usage_deletes_glance'),
url(r'^db/usage/deletes/(?P<delete_id>\d+)/$',
'stacktach.dbapi.get_usage_delete'),
url(r'^db/usage/nova/deletes/(?P<delete_id>\d+)/$',
'stacktach.dbapi.get_usage_delete'),
url(r'^db/usage/glance/deletes/(?P<delete_id>\d+)/$',
'stacktach.dbapi.get_usage_delete_glance'),
url(r'^db/usage/exists/$', 'stacktach.dbapi.list_usage_exists'),
url(r'^db/usage/nova/exists/$', 'stacktach.dbapi.list_usage_exists'),
url(r'^db/usage/glance/exists/$', 'stacktach.dbapi.list_usage_exists_glance'),
url(r'^db/usage/exists/(?P<exist_id>\d+)/$',
'stacktach.dbapi.get_usage_exist'),
url(r'^db/usage/nova/exists/(?P<exist_id>\d+)/$',
'stacktach.dbapi.get_usage_exist'),
url(r'^db/usage/glance/exists/(?P<exist_id>\d+)/$',
'stacktach.dbapi.get_usage_exist_glance'),
url(r'^db/confirm/usage/exists/(?P<message_id>[\w\-]+)/$',
'stacktach.dbapi.exists_send_status'),
url(r'^db/stats/nova/exists/$',
'stacktach.dbapi.get_usage_exist_stats'),
url(r'^db/stats/glance/exists/$',
'stacktach.dbapi.get_usage_exist_stats_glance'),
url(r'^db/stats/events/', 'stacktach.dbapi.get_event_stats'),
url(r'^db/repair/', 'stacktach.dbapi.repair_stacktach_down'),
)
urlpatterns = patterns('', *(web_urls + stacky_urls + dbapi_urls))

@ -19,8 +19,10 @@
# IN THE SOFTWARE.
import datetime
from decimal import Decimal
import json
from django.db.models import Count
from django.db.models import FieldDoesNotExist
from django.db import transaction
import mox
@ -41,8 +43,15 @@ class DBAPITestCase(StacktachBaseTestCase):
self.mox = mox.Mox()
dne_exception = models.InstanceExists.DoesNotExist
mor_exception = models.InstanceExists.MultipleObjectsReturned
self.mox.StubOutWithMock(models, 'RawData',
use_mock_anything=True)
self.mox.StubOutWithMock(models, 'InstanceExists',
use_mock_anything=True)
self.mox.StubOutWithMock(models, 'ImageExists',
use_mock_anything=True)
models.RawData.objects = self.mox.CreateMockAnything()
models.InstanceExists._meta = self.mox.CreateMockAnything()
models.ImageExists._meta = self.mox.CreateMockAnything()
models.InstanceExists.objects = self.mox.CreateMockAnything()
models.ImageExists.objects = self.mox.CreateMockAnything()
models.InstanceExists.DoesNotExist = dne_exception
@@ -124,8 +133,8 @@ class DBAPITestCase(StacktachBaseTestCase):
fake_request = self.mox.CreateMockAnything()
fake_request.GET = {'somebadfield_max': str(start_time)}
fake_model = self.make_fake_model()
fake_model._meta.get_field_by_name('somebadfield')\
.AndRaise(FieldDoesNotExist())
fake_model._meta.get_field_by_name('somebadfield') \
.AndRaise(FieldDoesNotExist())
self.mox.ReplayAll()
self.assertRaises(dbapi.BadRequestException, dbapi._get_filter_args,
@@ -307,7 +316,8 @@ class DBAPITestCase(StacktachBaseTestCase):
fake_request.GET = filters
self.mox.StubOutWithMock(dbapi, '_get_filter_args')
dbapi._get_filter_args(fake_model, fake_request,
custom_filters=custom_filters).AndReturn(filters)
custom_filters=custom_filters).AndReturn(
filters)
self.mox.StubOutWithMock(dbapi, '_check_has_field')
dbapi._check_has_field(fake_model, 'id')
result = self.mox.CreateMockAnything()
@@ -326,7 +336,7 @@ class DBAPITestCase(StacktachBaseTestCase):
self.mox.VerifyAll()
def test_list_usage_exists_no_custom_filters(self):
def test_list_usage_exists_no_custom_filters_for_nova(self):
fake_request = self.mox.CreateMockAnything()
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
@@ -340,6 +350,20 @@ class DBAPITestCase(StacktachBaseTestCase):
self.assertEqual(resp.status_code, 200)
self.mox.VerifyAll()
def test_list_usage_exists_no_custom_filters_for_glance(self):
fake_request = self.mox.CreateMockAnything()
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
objects = self.mox.CreateMockAnything()
dbapi.get_db_objects(models.ImageExists, fake_request, 'id',
custom_filters={}).AndReturn(objects)
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(objects, dbapi._exists_extra_values)
self.mox.ReplayAll()
resp = dbapi.list_usage_exists_glance(fake_request)
self.assertEqual(resp.status_code, 200)
self.mox.VerifyAll()
def test_list_usage_exists_with_received_min(self):
fake_request = self.mox.CreateMockAnything()
date = str(datetime.datetime.utcnow())
@@ -361,15 +385,16 @@ class DBAPITestCase(StacktachBaseTestCase):
fake_request = self.mox.CreateMockAnything()
date = str(datetime.datetime.utcnow())
fake_request.GET = {'received_max': date}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
unix_date = stacktach_utils.str_time_to_unix(date)
custom_filters = {'received_max': {'raw__when__lte': unix_date}}
objects = self.mox.CreateMockAnything()
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
dbapi.get_db_objects(models.InstanceExists, fake_request, 'id',
custom_filters=custom_filters).AndReturn(objects)
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(objects, dbapi._exists_extra_values)
self.mox.ReplayAll()
resp = dbapi.list_usage_exists(fake_request)
self.assertEqual(resp.status_code, 200)
self.mox.VerifyAll()
@@ -543,7 +568,8 @@ class DBAPITestCase(StacktachBaseTestCase):
exists1.send_status = 200
self.mox.VerifyAll()
def test_send_status_batch_accepts_post_for_nova_and_glance_when_version_is_1(self):
def test_send_status_batch_accepts_post_for_nova_and_glance_when_version_is_1(
self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'POST'
fake_request.GET = {'service': 'glance'}
@@ -571,14 +597,16 @@ class DBAPITestCase(StacktachBaseTestCase):
models.ImageExists.objects.select_for_update().AndReturn(results1)
exists1A = self.mox.CreateMockAnything()
exists1B = self.mox.CreateMockAnything()
results1.filter(message_id=MESSAGE_ID_2).AndReturn([exists1A, exists1B])
results1.filter(message_id=MESSAGE_ID_2).AndReturn(
[exists1A, exists1B])
exists1A.save()
exists1B.save()
results2 = self.mox.CreateMockAnything()
models.ImageExists.objects.select_for_update().AndReturn(results2)
exists2A = self.mox.CreateMockAnything()
exists2B = self.mox.CreateMockAnything()
results2.filter(message_id=MESSAGE_ID_1).AndReturn([exists2A, exists2B])
results2.filter(message_id=MESSAGE_ID_1).AndReturn(
[exists2A, exists2B])
exists2A.save()
exists2B.save()
trans_obj.__exit__(None, None, None)
@@ -589,7 +617,6 @@ class DBAPITestCase(StacktachBaseTestCase):
self.mox.VerifyAll()
def test_send_status_batch_accepts_post_when_version_is_0(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'POST'
@@ -734,3 +761,547 @@ class DBAPITestCase(StacktachBaseTestCase):
msg = "'messages' missing from request body"
self.assertEqual(body.get('message'), msg)
self.mox.VerifyAll()
def test_list_usage_launches_without_service(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
mock_objects = self.mox.CreateMockAnything()
launches = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(mock_objects).AndReturn(launches)
dbapi.get_db_objects(models.InstanceUsage, fake_request,
'launched_at').AndReturn(mock_objects)
self.mox.ReplayAll()
resp = dbapi.list_usage_launches(fake_request)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'launches': launches})
self.mox.VerifyAll()
def test_list_usage_launches_for_glance(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
mock_objects = self.mox.CreateMockAnything()
launches = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(mock_objects).AndReturn(launches)
dbapi.get_db_objects(models.ImageUsage, fake_request,
'created_at').AndReturn(mock_objects)
self.mox.ReplayAll()
resp = dbapi.list_usage_images(fake_request)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'images': launches})
self.mox.VerifyAll()
def test_list_usage_launches_for_nova(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
mock_objects = self.mox.CreateMockAnything()
launches = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(mock_objects).AndReturn(launches)
dbapi.get_db_objects(models.InstanceUsage, fake_request,
'launched_at').AndReturn(mock_objects)
self.mox.ReplayAll()
resp = dbapi.list_usage_launches(fake_request)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'launches': launches})
self.mox.VerifyAll()
def test_get_usage_launch_with_no_service(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
launch = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_get_model_by_id')
dbapi._get_model_by_id(models.InstanceUsage, 1).AndReturn(launch)
self.mox.ReplayAll()
resp = dbapi.get_usage_launch(fake_request, 1)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'launch': {'a': 1}})
self.mox.VerifyAll()
def test_get_usage_launch_for_nova(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
launch = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_get_model_by_id')
dbapi._get_model_by_id(models.InstanceUsage, 1).AndReturn(launch)
self.mox.ReplayAll()
resp = dbapi.get_usage_launch(fake_request, 1)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'launch': {'a': 1}})
self.mox.VerifyAll()
def test_get_usage_launch_for_glance(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
launch = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_get_model_by_id')
dbapi._get_model_by_id(models.ImageUsage, 1).AndReturn(launch)
self.mox.ReplayAll()
resp = dbapi.get_usage_image(fake_request, 1)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'launch': {'a': 1}})
self.mox.VerifyAll()
def test_get_usage_delete_for_nova(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
delete = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_get_model_by_id')
dbapi._get_model_by_id(models.InstanceDeletes, 1).AndReturn(delete)
self.mox.ReplayAll()
resp = dbapi.get_usage_delete(fake_request, 1)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'delete': {'a': 1}})
self.mox.VerifyAll()
def test_get_usage_delete_for_glance(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
delete = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_get_model_by_id')
dbapi._get_model_by_id(models.ImageDeletes, 1).AndReturn(delete)
self.mox.ReplayAll()
resp = dbapi.get_usage_delete_glance(fake_request, 1)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'delete': {'a': 1}})
self.mox.VerifyAll()
def test_list_usage_deletes_with_no_service(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
mock_objects = self.mox.CreateMockAnything()
deletes = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(mock_objects).AndReturn(deletes)
dbapi.get_db_objects(models.InstanceDeletes, fake_request,
'launched_at').AndReturn(mock_objects)
self.mox.ReplayAll()
resp = dbapi.list_usage_deletes(fake_request)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'deletes': deletes})
self.mox.VerifyAll()
def test_list_usage_deletes_for_nova(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
mock_objects = self.mox.CreateMockAnything()
deletes = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(mock_objects).AndReturn(deletes)
dbapi.get_db_objects(models.InstanceDeletes, fake_request,
'launched_at').AndReturn(mock_objects)
self.mox.ReplayAll()
resp = dbapi.list_usage_deletes(fake_request)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'deletes': deletes})
self.mox.VerifyAll()
def test_list_usage_deletes_for_glance(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
self.mox.StubOutWithMock(dbapi, 'get_db_objects')
mock_objects = self.mox.CreateMockAnything()
deletes = {'a': 1}
self.mox.StubOutWithMock(dbapi, '_convert_model_list')
dbapi._convert_model_list(mock_objects).AndReturn(deletes)
dbapi.get_db_objects(models.ImageDeletes, fake_request,
'deleted_at').AndReturn(mock_objects)
self.mox.ReplayAll()
resp = dbapi.list_usage_deletes_glance(fake_request)
self.assertEqual(resp.status_code, 200)
self.assertEqual(json.loads(resp.content), {'deletes': deletes})
self.mox.VerifyAll()
def test_get_usage_exist_stats_nova(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
query = self.mox.CreateMockAnything()
models.InstanceExists.objects.filter().AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_nova_received_min(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
now = datetime.datetime.utcnow()
fake_request.GET = {'received_min': str(now)}
query = self.mox.CreateMockAnything()
filters = {'raw__when__gte': utils.decimal_utc(now)}
models.InstanceExists.objects.filter(**filters).AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_nova_received_max(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
now = datetime.datetime.utcnow()
fake_request.GET = {'received_max': str(now)}
query = self.mox.CreateMockAnything()
filters = {'raw__when__lte': utils.decimal_utc(now)}
models.InstanceExists.objects.filter(**filters).AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_nova_class_field_filter(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
now = datetime.datetime.utcnow()
fake_request.GET = {'audit_period_ending_min': str(now)}
query = self.mox.CreateMockAnything()
models.InstanceExists._meta.get_field_by_name('audit_period_ending')
filters = {'audit_period_ending__gte': utils.decimal_utc(now)}
models.InstanceExists.objects.filter(**filters).AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_glance(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {}
query = self.mox.CreateMockAnything()
models.ImageExists.objects.filter().AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats_glance(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_glance_received_min(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
now = datetime.datetime.utcnow()
fake_request.GET = {'received_min': str(now)}
query = self.mox.CreateMockAnything()
filters = {'raw__when__gte': utils.decimal_utc(now)}
models.ImageExists.objects.filter(**filters).AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats_glance(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_glance_received_max(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
now = datetime.datetime.utcnow()
fake_request.GET = {'received_max': str(now)}
query = self.mox.CreateMockAnything()
filters = {'raw__when__lte': utils.decimal_utc(now)}
models.ImageExists.objects.filter(**filters).AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats_glance(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_usage_exist_stats_glance_class_field_filter(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
now = datetime.datetime.utcnow()
fake_request.GET = {'audit_period_ending_min': str(now)}
query = self.mox.CreateMockAnything()
models.ImageExists._meta.get_field_by_name('audit_period_ending')
filters = {'audit_period_ending__gte': utils.decimal_utc(now)}
models.ImageExists.objects.filter(**filters).AndReturn(query)
query.values('status', 'send_status').AndReturn(query)
result = [
{'status': 'verified', 'send_status': 201L, 'event_count': 2},
{'status': 'failed', 'send_status': 0L, 'event_count': 1}
]
query.annotate(event_count=mox.IsA(Count)).AndReturn(result)
self.mox.ReplayAll()
response = dbapi.get_usage_exist_stats_glance(fake_request)
self.assertEqual(response.status_code, 200)
expected_response = json.dumps({'stats': result})
self.assertEqual(expected_response, response.content)
self.mox.VerifyAll()
def test_get_event_stats(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'service': "nova"}
mock_query = self.mox.CreateMockAnything()
models.RawData.objects.values('event').AndReturn(mock_query)
events = [
{'event': 'compute.instance.exists.verified', 'event_count': 100},
{'event': 'compute.instance.exists', 'event_count': 100}
]
mock_query.annotate(event_count=mox.IsA(Count)).AndReturn(events)
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.content,
json.dumps({'stats': events}))
self.mox.VerifyAll()
def test_get_event_stats_date_range(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
start = "2014-02-26 00:00:00"
end = "2014-02-27 00:00:00"
fake_request.GET = {'when_min': start,
'when_max': end,
'service': "nova"}
mock_query = self.mox.CreateMockAnything()
filters = {
'when__gte': stacktach_utils.str_time_to_unix(start),
'when__lte': stacktach_utils.str_time_to_unix(end)
}
models.RawData.objects.filter(**filters).AndReturn(mock_query)
mock_query.values('event').AndReturn(mock_query)
events = [
{'event': 'compute.instance.exists.verified', 'event_count': 100},
{'event': 'compute.instance.exists', 'event_count': 100}
]
mock_query.annotate(event_count=mox.IsA(Count)).AndReturn(events)
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.content,
json.dumps({'stats': events}))
self.mox.VerifyAll()
def test_get_verified_count(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'service': "nova",
'event': 'compute.instance.exists.verified'}
mock_query = self.mox.CreateMockAnything()
models.RawData.objects.values('event').AndReturn(mock_query)
events = [
{'event': 'compute.instance.exists.verified', 'event_count': 100},
{'event': 'compute.instance.exists', 'event_count': 100}
]
mock_query.annotate(event_count=mox.IsA(Count)).AndReturn(events)
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.content,
json.dumps({'stats': [events[0]]}))
self.mox.VerifyAll()
def test_get_verified_count_default(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'service': "nova",
'event': 'compute.instance.exists.verified'}
mock_query = self.mox.CreateMockAnything()
models.RawData.objects.values('event').AndReturn(mock_query)
events = [
{'event': 'compute.instance.create.start', 'event_count': 100},
{'event': 'compute.instance.exists', 'event_count': 100}
]
mock_query.annotate(event_count=mox.IsA(Count)).AndReturn(events)
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 200)
self.assertEqual(response.content,
json.dumps({'stats': [{'event': 'compute.instance.exists.verified', 'event_count': 0}]}))
self.mox.VerifyAll()
def test_get_verified_count_only_one_range_param_returns_400(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'when_min': "2014-020-26",
'service': "nova"}
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 400)
self.assertEqual(json.loads(response.content)['message'],
"When providing date range filters, "
"a min and max are required.")
self.mox.VerifyAll()
def test_get_verified_count_only_large_date_range_returns_400(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'when_min': "2014-2-26 00:00:00",
'when_max': "2014-3-5 00:00:01", # > 7 days later
'service': "nova"}
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 400)
self.assertEqual(json.loads(response.content)['message'],
"Date ranges may be no larger than 604800 seconds")
self.mox.VerifyAll()
def test_get_verified_count_wrong_date_format_returns_400(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'when_min': "2014-020-26",
'when_max': "2014-020-26",
'service': "nova"}
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 400)
self.assertEqual(json.loads(response.content)['message'],
"Invalid format for date"
" (Correct format should be %Y-%m-%d %H:%M:%S)")
self.mox.VerifyAll()
def test_get_verified_count_wrong_service_returns_400(self):
fake_request = self.mox.CreateMockAnything()
fake_request.method = 'GET'
fake_request.GET = {'when_min': "2014-02-26 00:00:00",
"when_max": "2014-02-27 00:00:00",
'service': "qonos"}
self.mox.ReplayAll()
response = dbapi.get_event_stats(fake_request)
self.assertEqual(response.status_code, 400)
self.assertEqual(json.loads(response.content)['message'],
"Invalid service")
self.mox.VerifyAll()
class StacktachRepairScenarioApi(StacktachBaseTestCase):
def setUp(self):
self.mox = mox.Mox()
def tearDown(self):
self.mox.UnsetStubs()
def test_change_nova_exists_status_for_all_exists(self):
request = self.mox.CreateMockAnything()
request.POST = self.mox.CreateMockAnything()
message_ids = ["04fd94b5-64dd-4559-83b7-981d9d4f7a5a",
"14fd94b5-64dd-4559-83b7-981d9d4f7a5a",
"24fd94b5-64dd-4559-83b7-981d9d4f7a5a"]
request.POST._iterlists().AndReturn([('service', ['nova']),
('message_ids', message_ids)])
self.mox.StubOutWithMock(models.InstanceExists,
'mark_exists_as_sent_unverified')
models.InstanceExists.mark_exists_as_sent_unverified(message_ids).\
AndReturn([[], []])
self.mox.ReplayAll()
response = dbapi.repair_stacktach_down(request)
self.assertEqual(response.status_code, 200)
response_data = json.loads(response.content)
self.assertEqual(response_data['exists_not_pending'], [])
self.assertEqual(response_data['absent_exists'], [])
self.mox.VerifyAll()
def test_change_glance_exists_status_for_all_exists(self):
request = self.mox.CreateMockAnything()
request.POST = self.mox.CreateMockAnything()
message_ids = ['04fd94b5-64dd-4559-83b7-981d9d4f7a5a',
'14fd94b5-64dd-4559-83b7-981d9d4f7a5a',
'24fd94b5-64dd-4559-83b7-981d9d4f7a5a']
request.POST._iterlists().AndReturn([('service', ['glance']),
('message_ids', message_ids)])
self.mox.StubOutWithMock(models.ImageExists,
'mark_exists_as_sent_unverified')
models.ImageExists.mark_exists_as_sent_unverified(message_ids).\
AndReturn([[], []])
self.mox.ReplayAll()
response = dbapi.repair_stacktach_down(request)
self.assertEqual(response.status_code, 200)
response_data = json.loads(response.content)
self.assertEqual(response_data['exists_not_pending'], [])
self.assertEqual(response_data['absent_exists'], [])
self.mox.VerifyAll()
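The tests above all follow mox's record/replay/verify cycle: stub out collaborators, record the expected calls and return values, call `ReplayAll()`, exercise the view, and finish with `VerifyAll()`. A rough stdlib equivalent of that flow using `unittest.mock` — the names `get_db_objects` and `list_usage_deletes` here are stand-ins, not the project's actual implementations:

```python
from unittest import mock

# Sketch of the record/replay/verify idea with unittest.mock:
# configure the stub, exercise the code under test, then assert the calls.
get_db_objects = mock.Mock(return_value=['row1', 'row2'])

def list_usage_deletes(request):  # stand-in for a dbapi view
    return {'deletes': get_db_objects(request)}

resp = list_usage_deletes('fake-request')
assert resp == {'deletes': ['row1', 'row2']}
get_db_objects.assert_called_once_with('fake-request')
```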

@@ -17,11 +17,9 @@
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
from datetime import datetime
import datetime
import decimal
import json
import logging
import uuid
import kombu
@@ -31,7 +29,7 @@ from stacktach import datetime_to_decimal as dt
from stacktach import stacklog
from stacktach import models
from tests.unit import StacktachBaseTestCase
from utils import IMAGE_UUID_1
from utils import IMAGE_UUID_1, SIZE_1, SIZE_2, CREATED_AT_1, CREATED_AT_2
from utils import GLANCE_VERIFIER_EVENT_TYPE
from utils import make_verifier_config
from verifier import glance_verifier
@@ -87,8 +85,8 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
def test_verify_usage_created_at_mismatch(self):
exist = self.mox.CreateMockAnything()
exist.usage = self.mox.CreateMockAnything()
exist.created_at = decimal.Decimal('1.1')
exist.usage.created_at = decimal.Decimal('2.1')
exist.created_at = CREATED_AT_1
exist.usage.created_at = CREATED_AT_2
self.mox.ReplayAll()
with self.assertRaises(FieldMismatch) as cm:
@@ -96,8 +94,8 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
exception = cm.exception
self.assertEqual(exception.field_name, 'created_at')
self.assertEqual(exception.expected, decimal.Decimal('1.1'))
self.assertEqual(exception.actual, decimal.Decimal('2.1'))
self.assertEqual(exception.expected, CREATED_AT_1)
self.assertEqual(exception.actual, CREATED_AT_2)
self.mox.VerifyAll()
@@ -119,10 +117,10 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
def test_verify_usage_size_mismatch(self):
exist = self.mox.CreateMockAnything()
exist.size = 1234
exist.size = SIZE_1
exist.usage = self.mox.CreateMockAnything()
exist.usage.size = 5678
exist.usage.size = SIZE_2
self.mox.ReplayAll()
with self.assertRaises(FieldMismatch) as cm:
@@ -130,8 +128,8 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
exception = cm.exception
self.assertEqual(exception.field_name, 'size')
self.assertEqual(exception.expected, 1234)
self.assertEqual(exception.actual, 5678)
self.assertEqual(exception.expected, SIZE_1)
self.assertEqual(exception.actual, SIZE_2)
self.mox.VerifyAll()
@@ -255,30 +253,16 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.assertEqual(exception.actual, decimal.Decimal('4.1'))
self.mox.VerifyAll()
def test_verify_for_delete_size_mismatch(self):
exist = self.mox.CreateMockAnything()
exist.delete = self.mox.CreateMockAnything()
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.delete.launched_at = decimal.Decimal('1.1')
exist.delete.deleted_at = decimal.Decimal('6.1')
def test_should_verify_that_image_size_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
try:
glance_verifier._verify_for_delete(exist)
self.fail()
except FieldMismatch, fm:
self.assertEqual(fm.field_name, 'deleted_at')
self.assertEqual(fm.expected, decimal.Decimal('5.1'))
self.assertEqual(fm.actual, decimal.Decimal('6.1'))
self.mox.VerifyAll()
def test_should_verify_that_image_size_in_exist_is_not_null(self):
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = None
exist.created_at = decimal.Decimal('5.1')
exist.uuid = 'abcd1234'
exist.uuid = '1234-5678-9012-3456'
self.mox.ReplayAll()
try:
@@ -286,26 +270,40 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except NullFieldException as nf:
self.assertEqual(nf.field_name, 'image_size')
self.assertEqual(nf.reason, "image_size field was null for exist id 23")
self.assertEqual(
nf.reason, "Failed at 2014-01-02 03:04:05 UTC for "
"1234-5678-9012-3456: image_size field was null for "
"exist id 23")
self.mox.VerifyAll()
def test_should_verify_that_created_at_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-01 01:02:03')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 'size'
exist.created_at = None
exist.uuid = 'abcd1234'
exist.uuid = '1234-5678-9012-3456'
self.mox.ReplayAll()
try:
with self.assertRaises(NullFieldException) as nfe:
glance_verifier._verify_validity(exist)
self.fail()
except NullFieldException as nf:
self.assertEqual(nf.field_name, 'created_at')
self.assertEqual(nf.reason, "created_at field was null for exist id 23")
exception = nfe.exception
self.assertEqual(exception.field_name, 'created_at')
self.assertEqual(exception.reason,
"Failed at 2014-01-01 01:02:03 UTC for "
"1234-5678-9012-3456: created_at field was "
"null for exist id 23")
self.mox.VerifyAll()
def test_should_verify_that_uuid_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-01 01:02:03')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 'size'
@@ -318,15 +316,21 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except NullFieldException as nf:
self.assertEqual(nf.field_name, 'uuid')
self.assertEqual(nf.reason, "uuid field was null for exist id 23")
self.assertEqual(
nf.reason, "Failed at 2014-01-01 01:02:03 UTC for None: "
"uuid field was null for exist id 23")
self.mox.VerifyAll()
def test_should_verify_that_owner_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 1234
exist.created_at = decimal.Decimal('5.1')
exist.uuid = 'abcd1234'
exist.uuid = '1234-5678-9012-3456'
exist.owner = None
self.mox.ReplayAll()
@@ -335,10 +339,16 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except NullFieldException as nf:
self.assertEqual(nf.field_name, 'owner')
self.assertEqual(nf.reason, "owner field was null for exist id 23")
self.assertEqual(
nf.reason, "Failed at 2014-01-02 03:04:05 UTC for "
"1234-5678-9012-3456: owner field was null for exist id 23")
self.mox.VerifyAll()
def test_should_verify_that_uuid_value_is_uuid_like(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 'size'
@@ -351,10 +361,17 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except WrongTypeException as wt:
self.assertEqual(wt.field_name, 'uuid')
self.assertEqual(wt.reason, "{ uuid : asdfe-fgh } of incorrect type for exist id 23")
self.assertEqual(
wt.reason,
"Failed at 2014-01-02 03:04:05 UTC for None: "
"{uuid: asdfe-fgh} was of incorrect type for exist id 23")
self.mox.VerifyAll()
def test_should_verify_created_at_is_decimal(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 'size'
@@ -367,13 +384,21 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except WrongTypeException as wt:
self.assertEqual(wt.field_name, 'created_at')
self.assertEqual(wt.reason, "{ created_at : 123.a } of incorrect type for exist id 23")
self.assertEqual(
wt.reason,
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: {created_at: 123.a} was "
"of incorrect type for exist id 23")
self.mox.VerifyAll()
def test_should_verify_image_size_is_of_type_decimal(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 'size'
exist.size = 'random'
exist.created_at = decimal.Decimal('5.1')
exist.uuid = "58fb036d-5ef8-47a8-b503-7571276c400a"
self.mox.ReplayAll()
@@ -383,10 +408,18 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except WrongTypeException as wt:
self.assertEqual(wt.field_name, 'size')
self.assertEqual(wt.reason, "{ size : size } of incorrect type for exist id 23")
self.assertEqual(
wt.reason,
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: {size: random} was "
"of incorrect type for exist id 23")
self.mox.VerifyAll()
def test_should_verify_owner_is_of_type_hex(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exist = self.mox.CreateMockAnything()
exist.id = 23
exist.size = 1234L
@@ -400,7 +433,12 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.fail()
except WrongTypeException as wt:
self.assertEqual(wt.field_name, 'owner')
self.assertEqual(wt.reason, "{ owner : 3762854cd6f6435998188d5120e4c271,kl } of incorrect type for exist id 23")
self.assertEqual(
wt.reason,
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: "
"{owner: 3762854cd6f6435998188d5120e4c271,kl} was of "
"incorrect type for exist id 23")
self.mox.VerifyAll()
def test_should_verify_correctly_for_all_non_null_and_valid_types(self):
@@ -435,10 +473,9 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.assertTrue(verified)
def test_verify_exist_marks_exist_failed_if_field_mismatch_exception(self):
mock_logger = self._setup_mock_logger()
self.mox.StubOutWithMock(mock_logger, 'info')
mock_logger.exception("glance: Expected field to be 'expected' "
"got 'actual'")
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-01 01:01:01')
self.mox.ReplayAll()
exist1 = self.mox.CreateMockAnything()
exist2 = self.mox.CreateMockAnything()
@@ -446,11 +483,13 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.mox.StubOutWithMock(glance_verifier, '_verify_for_usage')
self.mox.StubOutWithMock(glance_verifier, '_verify_for_delete')
self.mox.StubOutWithMock(glance_verifier, '_verify_validity')
field_mismatch_exc = FieldMismatch('field', 'expected', 'actual')
field_mismatch_exc = FieldMismatch('field', 'expected',
'actual', 'uuid')
glance_verifier._verify_for_usage(exist1).AndRaise(
exception=field_mismatch_exc)
exist1.mark_failed(reason='FieldMismatch')
exist1.mark_failed(
reason="Failed at 2014-01-01 01:01:01 UTC for uuid: Expected "
"field to be 'expected' got 'actual'")
glance_verifier._verify_for_usage(exist2)
glance_verifier._verify_for_delete(exist2)
@@ -462,25 +501,41 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.mox.VerifyAll()
self.assertFalse(verified)
def test_verify_for_range_without_callback(self):
def test_verify_for_range_without_callback_for_sent_unverified(self):
mock_logger = self._setup_mock_logger()
self.mox.StubOutWithMock(mock_logger, 'info')
stacklog.get_logger('verifier', is_parent=False).AndReturn(mock_logger)
mock_logger.info('glance: Adding 2 per-owner exists to queue.')
when_max = datetime.utcnow()
mock_logger.info('glance: Adding 2 per-owner exists to queue.')
when_max = datetime.datetime.utcnow()
models.ImageExists.VERIFYING = 'verifying'
models.ImageExists.PENDING = 'pending'
models.ImageExists.SENT_VERIFYING = 'sent_verifying'
models.ImageExists.SENT_UNVERIFIED = 'sent_unverified'
self.mox.StubOutWithMock(models.ImageExists, 'find')
exist1 = self.mox.CreateMockAnything()
exist2 = self.mox.CreateMockAnything()
exist3 = self.mox.CreateMockAnything()
exist4 = self.mox.CreateMockAnything()
exist5 = self.mox.CreateMockAnything()
results = {'owner1': [exist1, exist2], 'owner2': [exist3]}
sent_results = {'owner1': [exist4], 'owner2': [exist5]}
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=when_max,
status=models.ImageExists.SENT_UNVERIFIED).AndReturn(sent_results)
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=when_max,
status=models.ImageExists.PENDING).AndReturn(results)
exist1.save()
exist2.save()
exist3.save()
exist4.save()
exist5.save()
self.pool.apply_async(glance_verifier._verify,
args=([exist4],), callback=None)
self.pool.apply_async(glance_verifier._verify, args=([exist5],),
callback=None)
self.pool.apply_async(glance_verifier._verify,
args=([exist1, exist2],), callback=None)
self.pool.apply_async(glance_verifier._verify, args=([exist3],),
@@ -491,21 +546,29 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
self.assertEqual(exist1.status, 'verifying')
self.assertEqual(exist2.status, 'verifying')
self.assertEqual(exist3.status, 'verifying')
self.assertEqual(exist4.status, 'sent_verifying')
self.assertEqual(exist5.status, 'sent_verifying')
self.mox.VerifyAll()
def test_verify_for_range_with_callback(self):
mock_logger = self._setup_mock_logger()
self.mox.StubOutWithMock(mock_logger, 'info')
stacklog.get_logger('verifier', is_parent=False).AndReturn(mock_logger)
mock_logger.info('glance: Adding 0 per-owner exists to queue.')
mock_logger.info('glance: Adding 2 per-owner exists to queue.')
callback = self.mox.CreateMockAnything()
when_max = datetime.utcnow()
when_max = datetime.datetime.utcnow()
models.ImageExists.SENT_VERIFYING = 'sent_verifying'
models.ImageExists.SENT_UNVERIFIED = 'sent_unverified'
models.ImageExists.PENDING = 'pending'
models.ImageExists.VERIFYING = 'verifying'
exist1 = self.mox.CreateMockAnything()
exist2 = self.mox.CreateMockAnything()
exist3 = self.mox.CreateMockAnything()
results = {'owner1': [exist1, exist2], 'owner2': [exist3]}
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=when_max,
status=models.ImageExists.SENT_UNVERIFIED).AndReturn([])
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=when_max,
status=models.ImageExists.PENDING).AndReturn(results)
@@ -539,8 +602,8 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
]
exist_str = json.dumps(exist_dict)
exist.raw.json = exist_str
exist.audit_period_beginning = datetime(2013, 10, 10)
exist.audit_period_ending = datetime(2013, 10, 10, 23, 59, 59)
exist.audit_period_beginning = datetime.datetime(2013, 10, 10)
exist.audit_period_ending = datetime.datetime(2013, 10, 10, 23, 59, 59)
exist.owner = "1"
self.mox.StubOutWithMock(uuid, 'uuid4')
uuid.uuid4().AndReturn('some_other_uuid')
@@ -581,8 +644,8 @@ class GlanceVerifierTestCase(StacktachBaseTestCase):
]
exist_str = json.dumps(exist_dict)
exist.raw.json = exist_str
exist.audit_period_beginning = datetime(2013, 10, 10)
exist.audit_period_ending = datetime(2013, 10, 10, 23, 59, 59)
exist.audit_period_beginning = datetime.datetime(2013, 10, 10)
exist.audit_period_ending = datetime.datetime(2013, 10, 10, 23, 59, 59)
exist.owner = "1"
self.mox.StubOutWithMock(kombu.pools, 'producers')
self.mox.StubOutWithMock(kombu.common, 'maybe_declare')

@@ -112,6 +112,81 @@ class ImageExistsTestCase(unittest.TestCase):
'owner1-3': [exist4],
'owner2-2': [exist2]})
def test_mark_exists_as_sent_unverified(self):
message_ids = ["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b",
"9156b83e-f684-4ec3-8f94-7e41902f27aa"]
exist1 = self.mox.CreateMockAnything()
exist1.status = "pending"
exist1.save()
exist2 = self.mox.CreateMockAnything()
exist2.status = "pending"
exist2.save()
exist3 = self.mox.CreateMockAnything()
exist3.status = "pending"
exist3.save()
self.mox.StubOutWithMock(ImageExists.objects, 'filter')
ImageExists.objects.filter(message_id=message_ids[0]).AndReturn(
[exist1, exist2])
ImageExists.objects.filter(message_id=message_ids[1]).AndReturn(
[exist3])
self.mox.ReplayAll()
results = ImageExists.mark_exists_as_sent_unverified(message_ids)
self.assertEqual(results, ([], []))
self.mox.VerifyAll()
def test_mark_exists_as_sent_unverified_return_absent_exists(self):
message_ids = ["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b",
"9156b83e-f684-4ec3-8f94-7e41902f27aa"]
exist1 = self.mox.CreateMockAnything()
exist1.status = "pending"
exist1.save()
exist2 = self.mox.CreateMockAnything()
exist2.status = "pending"
exist2.save()
self.mox.StubOutWithMock(ImageExists.objects, 'filter')
ImageExists.objects.filter(message_id=message_ids[0]).AndReturn(
[exist1, exist2])
ImageExists.objects.filter(message_id=message_ids[1]).AndReturn([])
self.mox.ReplayAll()
results = ImageExists.mark_exists_as_sent_unverified(message_ids)
self.assertEqual(results, (['9156b83e-f684-4ec3-8f94-7e41902f27aa'],
[]))
self.mox.VerifyAll()
def test_mark_exists_as_sent_unverified_and_return_exist_not_pending(self):
message_ids = ["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b",
"9156b83e-f684-4ec3-8f94-7e41902f27aa"]
exist1 = self.mox.CreateMockAnything()
exist1.status = "pending"
exist1.save()
exist2 = self.mox.CreateMockAnything()
exist2.status = "verified"
exist3 = self.mox.CreateMockAnything()
exist3.status = "pending"
exist3.save()
self.mox.StubOutWithMock(ImageExists.objects, 'filter')
ImageExists.objects.filter(message_id=message_ids[0]).AndReturn(
[exist1, exist2])
ImageExists.objects.filter(message_id=message_ids[1]).AndReturn(
[exist3])
self.mox.ReplayAll()
results = ImageExists.mark_exists_as_sent_unverified(message_ids)
self.assertEqual(results, ([],
["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b"]))
self.mox.VerifyAll()
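The three tests above pin down the contract of `mark_exists_as_sent_unverified`: it returns a pair `(absent_message_ids, not_pending_message_ids)`, flips every still-pending exist to `sent_unverified`, and leaves non-pending rows untouched. A minimal sketch of that logic, with an injected `find_exists` lookup standing in for the Django manager (names assumed for illustration):

```python
PENDING = 'pending'
SENT_UNVERIFIED = 'sent_unverified'


def mark_exists_as_sent_unverified(message_ids, find_exists):
    """Mark pending exists for the given message ids as sent_unverified.

    find_exists(message_id) returns the list of exist rows (possibly empty),
    here modelled as dicts with a 'status' key.
    Returns (absent_message_ids, not_pending_message_ids).
    """
    absent, not_pending = [], []
    for message_id in message_ids:
        exists = find_exists(message_id)
        if not exists:
            # No exist rows at all for this message id.
            absent.append(message_id)
            continue
        for exist in exists:
            if exist['status'] != PENDING:
                # Report the id, but do not touch the row.
                if message_id not in not_pending:
                    not_pending.append(message_id)
            else:
                exist['status'] = SENT_UNVERIFIED
    return absent, not_pending
```

Note the asymmetry the last test exercises: a message id with one pending and one verified row lands in `not_pending_message_ids`, yet its pending sibling is still transitioned and saved.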
class InstanceExistsTestCase(unittest.TestCase):
def setUp(self):
@@ -137,3 +212,66 @@ class InstanceExistsTestCase(unittest.TestCase):
self.mox.VerifyAll()
self.assertEqual(results, [1, 2])
def test_mark_exists_as_sent_unverified(self):
message_ids = ["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b",
"9156b83e-f684-4ec3-8f94-7e41902f27aa"]
exist1 = self.mox.CreateMockAnything()
exist1.status = "pending"
exist1.save()
exist2 = self.mox.CreateMockAnything()
exist2.status = "pending"
exist2.save()
self.mox.StubOutWithMock(InstanceExists.objects, 'get')
InstanceExists.objects.get(message_id=message_ids[0]).AndReturn(exist1)
InstanceExists.objects.get(message_id=message_ids[1]).AndReturn(exist2)
self.mox.ReplayAll()
results = InstanceExists.mark_exists_as_sent_unverified(message_ids)
self.assertEqual(results, ([], []))
self.mox.VerifyAll()
def test_mark_exists_as_sent_unverified_return_absent_exists(self):
message_ids = ["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b",
"9156b83e-f684-4ec3-8f94-7e41902f27aa"]
exist1 = self.mox.CreateMockAnything()
exist1.status = "pending"
exist1.save()
self.mox.StubOutWithMock(InstanceExists.objects, 'get')
InstanceExists.objects.get(message_id=message_ids[0]).AndReturn(exist1)
InstanceExists.objects.get(message_id=message_ids[1]).AndRaise(
Exception)
self.mox.ReplayAll()
results = InstanceExists.mark_exists_as_sent_unverified(message_ids)
self.assertEqual(results, (['9156b83e-f684-4ec3-8f94-7e41902f27aa'],
[]))
self.mox.VerifyAll()
def test_mark_exists_as_sent_unverified_and_return_exist_not_pending(self):
message_ids = ["0708cb0b-6169-4d7c-9f58-3cf3d5bf694b",
"9156b83e-f684-4ec3-8f94-7e41902f27aa"]
exist1 = self.mox.CreateMockAnything()
exist1.status = "pending"
exist1.save()
exist2 = self.mox.CreateMockAnything()
exist2.status = "verified"
self.mox.StubOutWithMock(InstanceExists.objects, 'get')
InstanceExists.objects.get(message_id=message_ids[0]).AndReturn(exist1)
InstanceExists.objects.get(message_id=message_ids[1]).AndReturn(exist2)
self.mox.ReplayAll()
results = InstanceExists.mark_exists_as_sent_unverified(message_ids)
self.assertEqual(results, ([],
["9156b83e-f684-4ec3-8f94-7e41902f27aa"]))
self.mox.VerifyAll()

@@ -32,7 +32,7 @@ from stacktach import datetime_to_decimal as dt
from stacktach import stacklog
from stacktach import models
from tests.unit import StacktachBaseTestCase
from utils import make_verifier_config
from utils import make_verifier_config
from utils import LAUNCHED_AT_1
from utils import LAUNCHED_AT_2
from utils import DELETED_AT_1
from utils import DELETED_AT_2
from utils import INSTANCE_FLAVOR_ID_1
from utils import INSTANCE_FLAVOR_ID_2
from utils import FLAVOR_FIELD_NAME
from utils import INSTANCE_ID_1
from utils import RAX_OPTIONS_1
from utils import RAX_OPTIONS_2
@@ -54,12 +54,14 @@ from verifier import FieldMismatch
from verifier import NotFound
from verifier import VerificationException
class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
def setUp(self):
self.mox = mox.Mox()
self.mox.StubOutWithMock(models, 'InstanceUsage',
use_mock_anything=True)
models.InstanceUsage.objects = self.mox.CreateMockAnything()
self._setup_verifier()
def _setup_verifier(self):
@@ -132,28 +134,36 @@ class NovaVerifierVerifyForLaunchTestCase(StacktachBaseTestCase):
self.mox.VerifyAll()
    def test_verify_for_launch_flavor_id_mismatch(self):
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.usage = self.mox.CreateMockAnything()
exist.launched_at = decimal.Decimal('1.1')
exist.dummy_flavor_field_name = 'dummy_flavor_1'
exist.usage.launched_at = decimal.Decimal('1.1')
exist.usage.dummy_flavor_field_name = 'dummy_flavor_2'
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn(FLAVOR_FIELD_NAME)
exist = self.mox.CreateMockAnything()
exist.instance = INSTANCE_ID_1
exist.usage = self.mox.CreateMockAnything()
exist.launched_at = decimal.Decimal(LAUNCHED_AT_1)
exist.flavor_field_name = INSTANCE_FLAVOR_ID_1
exist.usage.launched_at = decimal.Decimal(LAUNCHED_AT_1)
exist.usage.flavor_field_name = INSTANCE_FLAVOR_ID_2
self.mox.ReplayAll()
with self.assertRaises(FieldMismatch) as fm:
nova_verifier._verify_for_launch(exist)
exception = fm.exception
self.assertEqual(exception.field_name, 'dummy_flavor_field_name')
self.assertEqual(exception.expected, 'dummy_flavor_1')
self.assertEqual(exception.actual, 'dummy_flavor_2')
self.assertEqual(exception.field_name, FLAVOR_FIELD_NAME)
self.assertEqual(exception.expected, INSTANCE_FLAVOR_ID_1)
self.assertEqual(exception.actual, INSTANCE_FLAVOR_ID_2)
self.assertEqual(
exception.reason,
"Failed at 2014-01-02 03:04:05 UTC for "
"08f685d9-6352-4dbc-8271-96cc54bf14cd: Expected flavor_field_name "
"to be '1' got 'performance2-120'")
self.mox.VerifyAll()
def test_verify_for_launch_tenant_id_mismatch(self):
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn("flavor_field_name")
config.flavor_field_name().AndReturn(FLAVOR_FIELD_NAME)
exist = self.mox.CreateMockAnything()
exist.tenant = TENANT_ID_1
@@ -425,35 +435,35 @@ class NovaVerifierVerifyForDeleteTestCase(StacktachBaseTestCase):
def test_verify_for_delete_launched_at_mismatch(self):
exist = self.mox.CreateMockAnything()
exist.delete = self.mox.CreateMockAnything()
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.delete.launched_at = decimal.Decimal('2.1')
exist.delete.deleted_at = decimal.Decimal('5.1')
exist.launched_at = LAUNCHED_AT_1
exist.deleted_at = DELETED_AT_1
exist.delete.launched_at = LAUNCHED_AT_2
exist.delete.deleted_at = DELETED_AT_1
self.mox.ReplayAll()
with self.assertRaises(FieldMismatch) as fm:
nova_verifier._verify_for_delete(exist)
exception = fm.exception
self.assertEqual(exception.field_name, 'launched_at')
self.assertEqual(exception.expected, decimal.Decimal('1.1'))
self.assertEqual(exception.actual, decimal.Decimal('2.1'))
self.assertEqual(exception.expected, LAUNCHED_AT_1)
self.assertEqual(exception.actual, LAUNCHED_AT_2)
self.mox.VerifyAll()
def test_verify_for_delete_deleted_at_mismatch(self):
exist = self.mox.CreateMockAnything()
exist.delete = self.mox.CreateMockAnything()
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.delete.launched_at = decimal.Decimal('1.1')
exist.delete.deleted_at = decimal.Decimal('6.1')
exist.launched_at = LAUNCHED_AT_1
exist.deleted_at = DELETED_AT_1
exist.delete.launched_at = LAUNCHED_AT_1
exist.delete.deleted_at = DELETED_AT_2
self.mox.ReplayAll()
with self.assertRaises(FieldMismatch) as fm:
nova_verifier._verify_for_delete(exist)
exception = fm.exception
self.assertEqual(exception.field_name, 'deleted_at')
self.assertEqual(exception.expected, decimal.Decimal('5.1'))
self.assertEqual(exception.actual, decimal.Decimal('6.1'))
self.assertEqual(exception.expected, DELETED_AT_1)
self.assertEqual(exception.actual, DELETED_AT_2)
self.mox.VerifyAll()
@@ -798,18 +808,23 @@ class NovaVerifierVerifyTestCase(StacktachBaseTestCase):
self.assertFalse(result)
self.mox.VerifyAll()
def test_verify_for_range_without_callback(self):
mock_logger = self._create_mock_logger()
stacklog.get_logger('verifier', is_parent=False).AndReturn(mock_logger)
stacklog.get_logger('verifier', is_parent=False).AndReturn(mock_logger)
mock_logger.info('nova: Adding 0 exists to queue.')
mock_logger.info('nova: Adding 2 exists to queue.')
when_max = datetime.datetime.utcnow()
results = self.mox.CreateMockAnything()
sent_results = self.mox.CreateMockAnything()
models.InstanceExists.PENDING = 'pending'
models.InstanceExists.VERIFYING = 'verifying'
models.InstanceExists.SENT_UNVERIFIED = 'sent_unverified'
models.InstanceExists.find(
ending_max=when_max, status='sent_unverified').AndReturn(sent_results)
models.InstanceExists.find(
ending_max=when_max, status='pending').AndReturn(results)
sent_results.count().AndReturn(0)
results.count().AndReturn(2)
exist1 = self.mox.CreateMockAnything()
exist2 = self.mox.CreateMockAnything()
@@ -827,18 +842,25 @@ class NovaVerifierVerifyTestCase(StacktachBaseTestCase):
self.verifier.verify_for_range(when_max)
self.mox.VerifyAll()
def test_verify_for_range_with_callback(self):
callback = self.mox.CreateMockAnything()
mock_logger = self._create_mock_logger()
stacklog.get_logger('verifier', is_parent=False).AndReturn(mock_logger)
mock_logger.info("nova: Adding 2 exists to queue.")
callback = self.mox.CreateMockAnything()
stacklog.get_logger('verifier', is_parent=False).AndReturn(mock_logger)
mock_logger.info('nova: Adding 0 exists to queue.')
mock_logger.info('nova: Adding 2 exists to queue.')
when_max = datetime.datetime.utcnow()
results = self.mox.CreateMockAnything()
sent_results = self.mox.CreateMockAnything()
models.InstanceExists.PENDING = 'pending'
models.InstanceExists.VERIFYING = 'verifying'
models.InstanceExists.SENT_UNVERIFIED = 'sent_unverified'
models.InstanceExists.find(
ending_max=when_max, status='sent_unverified').AndReturn(sent_results)
models.InstanceExists.find(
ending_max=when_max, status='pending').AndReturn(results)
sent_results.count().AndReturn(0)
results.count().AndReturn(2)
exist1 = self.mox.CreateMockAnything()
exist2 = self.mox.CreateMockAnything()
@@ -857,6 +879,37 @@ class NovaVerifierVerifyTestCase(StacktachBaseTestCase):
self.mox.VerifyAll()
def test_verify_for_range_when_found_sent_unverified_messages(self):
callback = self.mox.CreateMockAnything()
when_max = datetime.datetime.utcnow()
results = self.mox.CreateMockAnything()
sent_results = self.mox.CreateMockAnything()
models.InstanceExists.PENDING = 'pending'
models.InstanceExists.VERIFYING = 'verifying'
models.InstanceExists.SENT_VERIFYING = 'sent_verifying'
models.InstanceExists.SENT_UNVERIFIED = 'sent_unverified'
models.InstanceExists.find(
ending_max=when_max, status='sent_unverified').AndReturn(sent_results)
models.InstanceExists.find(
ending_max=when_max, status='pending').AndReturn(results)
sent_results.count().AndReturn(2)
results.count().AndReturn(0)
exist1 = self.mox.CreateMockAnything()
exist2 = self.mox.CreateMockAnything()
sent_results.__getslice__(0, 1000).AndReturn(sent_results)
sent_results.__iter__().AndReturn([exist1, exist2].__iter__())
exist1.update_status('sent_verifying')
exist2.update_status('sent_verifying')
exist1.save()
exist2.save()
self.pool.apply_async(nova_verifier._verify, args=(exist1, 'all'),
callback=None)
self.pool.apply_async(nova_verifier._verify, args=(exist2, 'all'),
callback=None)
self.mox.ReplayAll()
self.verifier.verify_for_range(when_max, callback=callback)
self.mox.VerifyAll()
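Together, these `verify_for_range` tests describe a two-pass scheduling loop: previously sent-but-unverified exists are re-queued first and moved to `sent_verifying`, then fresh pending exists are queued and moved to `verifying`. A minimal sketch of that control flow, with `find` and `enqueue` callables standing in for `InstanceExists.find` and `pool.apply_async` (an assumed shape, not the verifier's actual code):

```python
def verify_for_range(find, enqueue, when_max):
    """Queue exists for verification in two passes.

    find(ending_max=..., status=...) yields exist rows (dicts here);
    enqueue(exist) hands one off to the worker pool.
    """
    transitions = (
        ('sent_unverified', 'sent_verifying'),  # retry earlier sends first
        ('pending', 'verifying'),               # then schedule new work
    )
    for status, next_status in transitions:
        for exist in find(ending_max=when_max, status=status):
            exist['status'] = next_status  # persisted via save() in the real model
            enqueue(exist)
```

Ordering matters: the tests expect the `sent_unverified` batch to hit `apply_async` before the `pending` batch.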
class NovaVerifierSendVerifiedNotificationTestCase(StacktachBaseTestCase):
def setUp(self):
self.mox = mox.Mox()
@@ -969,28 +1022,52 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
def tearDown(self):
self.mox.UnsetStubs()
def _create_mock_exist(self):
exist = self.mox.CreateMockAnything()
exist.instance = '58fb036d-5ef8-47a8-b503-7571276c400a'
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = '1'
exist.os_architecture = 'x64'
exist.os_distro = 'com.microsoft.server'
exist.os_version = '2008.2'
return exist
def test_should_verify_that_tenant_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist = self._create_mock_exist()
exist.tenant = None
exist.id = 23
self.mox.ReplayAll()
with self.assertRaises(NullFieldException) as nf:
nova_verifier._verify_validity(exist, 'all')
exception = nf.exception
self.assertEqual(exception.field_name, 'tenant')
self.assertEqual(exception.reason,
"tenant field was null for exist id 23")
self.assertEqual(
exception.reason, "Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: tenant field was null for "
"exist id 23")
self.mox.VerifyAll()
def test_should_verify_that_launched_at_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = 'tenant'
exist.id = 23
exist = self._create_mock_exist()
exist.launched_at = None
self.mox.ReplayAll()
@@ -998,17 +1075,21 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
nova_verifier._verify_validity(exist, 'all')
exception = nf.exception
self.assertEqual(exception.field_name, 'launched_at')
self.assertEqual(exception.reason,
"launched_at field was null for exist id 23")
self.assertEqual(
exception.reason, "Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: launched_at field was null "
"for exist id 23")
self.mox.VerifyAll()
def test_should_verify_that_instance_flavor_id_in_exist_is_not_null(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist = self._create_mock_exist()
exist.dummy_flavor_field_name = None
self.mox.ReplayAll()
@@ -1018,17 +1099,21 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'dummy_flavor_field_name')
self.assertEqual(
exception.reason,
"dummy_flavor_field_name field was null for exist id 23")
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: dummy_flavor_field_name "
"field was null for exist id 23")
self.mox.VerifyAll()
def test_should_verify_tenant_id_is_of_type_hex(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = 'tenant'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist = self._create_mock_exist()
exist.tenant = 'invalid_tenant'
self.mox.ReplayAll()
with self.assertRaises(WrongTypeException) as wt:
@@ -1037,17 +1122,21 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'tenant')
self.assertEqual(
exception.reason,
"{ tenant : tenant } of incorrect type for exist id 23")
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: {tenant: invalid_tenant} "
"was of incorrect type for exist id 23")
self.mox.VerifyAll()
def test_should_verify_launched_at_is_of_type_decimal(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist = self._create_mock_exist()
exist.launched_at = 111
exist.dummy_flavor_field_name = 'dummy_flavor'
self.mox.ReplayAll()
@@ -1057,17 +1146,20 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'launched_at')
self.assertEqual(
exception.reason,
"{ launched_at : 111 } of incorrect type for exist id 23")
'Failed at 2014-01-02 03:04:05 UTC for '
'58fb036d-5ef8-47a8-b503-7571276c400a: {launched_at: 111} was of '
'incorrect type for exist id 23')
self.mox.VerifyAll()
def test_should_verify_deleted_at_is_of_decimal_type_if_present(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist = self._create_mock_exist()
exist.deleted_at = 20
self.mox.ReplayAll()
@@ -1077,19 +1169,20 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'deleted_at')
self.assertEqual(
exception.reason,
"{ deleted_at : 20 } of incorrect type for exist id 23")
'Failed at 2014-01-02 03:04:05 UTC for '
'58fb036d-5ef8-47a8-b503-7571276c400a: {deleted_at: 20} was of '
'incorrect type for exist id 23')
self.mox.VerifyAll()
def test_should_verify_rax_options_should_be_of_integer_type(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist = self._create_mock_exist()
exist.rax_options = 'a'
self.mox.ReplayAll()
@@ -1099,18 +1192,20 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'rax_options')
self.assertEqual(
exception.reason,
"{ rax_options : a } of incorrect type for exist id 23")
'Failed at 2014-01-02 03:04:05 UTC for '
'58fb036d-5ef8-47a8-b503-7571276c400a: {rax_options: a} was of '
'incorrect type for exist id 23')
self.mox.VerifyAll()
def test_should_verify_rax_options_should_not_be_empty(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist = self._create_mock_exist()
exist.rax_options = ''
self.mox.ReplayAll()
@@ -1118,20 +1213,22 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
nova_verifier._verify_validity(exist, 'all')
exception = nf.exception
self.assertEqual(exception.field_name, 'rax_options')
self.assertEqual(exception.reason,
"rax_options field was null for exist id 23")
self.assertEqual(
exception.reason,
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: rax_options field was null "
"for exist id 23")
self.mox.VerifyAll()
def test_should_verify_os_arch_should_be_alphanumeric(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist = self._create_mock_exist()
exist.os_architecture = 'x64,'
self.mox.ReplayAll()
@@ -1141,19 +1238,19 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'os_architecture')
self.assertEqual(
exception.reason,
"{ os_architecture : x64, } of incorrect type for exist id 23")
'Failed at 2014-01-02 03:04:05 UTC for '
'58fb036d-5ef8-47a8-b503-7571276c400a: {os_architecture: x64,} '
'was of incorrect type for exist id 23')
self.mox.VerifyAll()
def test_should_verify_os_arch_should_not_be_empty(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist = self._create_mock_exist()
exist.os_architecture = ''
self.mox.ReplayAll()
@@ -1163,20 +1260,20 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'os_architecture')
self.assertEqual(
exception.reason,
"os_architecture field was null for exist id 23")
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: os_architecture field was "
"null for exist id 23")
self.mox.VerifyAll()
def test_should_verify_os_distro_should_be_alphanumeric(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist.os_architecture = 'x64'
exist = self._create_mock_exist()
exist.os_distro = 'com.microsoft.server,'
self.mox.ReplayAll()
@@ -1186,21 +1283,21 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'os_distro')
self.assertEqual(
exception.reason,
"{ os_distro : com.microsoft.server, } "
"of incorrect type for exist id 23")
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: "
"{os_distro: com.microsoft.server,} was of incorrect type for "
"exist id 23")
self.mox.VerifyAll()
def test_should_verify_os_distro_should_not_be_empty(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist.os_architecture = 'x64'
exist = self._create_mock_exist()
exist.os_distro = ''
self.mox.ReplayAll()
@@ -1210,21 +1307,20 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'os_distro')
self.assertEqual(
exception.reason,
"os_distro field was null for exist id 23")
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: os_distro field was null "
"for exist id 23")
self.mox.VerifyAll()
def test_should_verify_os_version_should_be_alphanumeric(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist.os_architecture = 'x64'
exist.os_distro = 'com.microsoft.server'
exist = self._create_mock_exist()
exist.os_version = '2008.2,'
self.mox.ReplayAll()
@@ -1234,21 +1330,20 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
self.assertEqual(exception.field_name, 'os_version')
self.assertEqual(
exception.reason,
"{ os_version : 2008.2, } of incorrect type for exist id 23")
'Failed at 2014-01-02 03:04:05 UTC for '
'58fb036d-5ef8-47a8-b503-7571276c400a: {os_version: 2008.2,} was '
'of incorrect type for exist id 23')
self.mox.VerifyAll()
def test_should_verify_os_version_should_not_be_empty(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist.os_architecture = 'x64'
exist.os_distro = 'com.microsoft.server'
exist = self._create_mock_exist()
exist.os_version = ''
self.mox.ReplayAll()
@@ -1257,30 +1352,26 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
exception = nf.exception
self.assertEqual(exception.field_name, 'os_version')
self.assertEqual(
exception.reason, "os_version field was null for exist id 23")
exception.reason,
"Failed at 2014-01-02 03:04:05 UTC for "
"58fb036d-5ef8-47a8-b503-7571276c400a: os_version field was null "
"for exist id 23")
self.mox.VerifyAll()
def test_should_verify_all_exist_fields_when_validity_check_value_is_all(self):
def test_should_verify_all_exist_fields_when_validity_check_value_all(self):
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist.deleted_at = decimal.Decimal('5.1')
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = '12'
exist.os_architecture = 'x64'
exist.os_distro = 'com.microsoft.server'
exist.os_version = '2008.2'
exist = self._create_mock_exist()
self.mox.ReplayAll()
nova_verifier._verify_validity(exist, 'all')
self.mox.VerifyAll()
def test_should_verify_only_basic_exist_fields_when_validity_check_value_is_basic(self):
def test_should_verify_only_basic_fields_when_validity_check_basic(self):
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
@@ -1303,16 +1394,9 @@ class NovaVerifierValidityTestCase(StacktachBaseTestCase):
def test_should_verify_exist_fields_even_if_deleted_at_is_none(self):
self.mox.StubOutWithMock(config, 'flavor_field_name')
config.flavor_field_name().AndReturn('dummy_flavor_field_name')
exist = self.mox.CreateMockAnything()
exist.tenant = '3762854cd6f6435998188d5120e4c271'
exist.id = 23
exist.launched_at = decimal.Decimal('1.1')
exist = self._create_mock_exist()
exist.deleted_at = None
exist.dummy_flavor_field_name = 'dummy_flavor'
exist.rax_options = 12
exist.os_architecture = 'x64'
exist.os_distro = 'com.microsoft.server'
exist.os_version = '2008.2'
self.mox.ReplayAll()
nova_verifier._verify_validity(exist, 'all')

@@ -38,8 +38,7 @@ from utils import TENANT_ID_1
from utils import INSTANCE_TYPE_ID_1
from utils import DUMMY_TIME
from utils import INSTANCE_TYPE_ID_2
from utils import IMAGE_UUID_1
from stacktach import stacklog
from stacktach import stacklog, models
from stacktach import notification
from stacktach import views
from tests.unit import StacktachBaseTestCase

@@ -17,6 +17,7 @@
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
import ast
import datetime
import decimal
@@ -1509,4 +1510,175 @@ class StackyServerTestCase(StacktachBaseTestCase):
actual_results = stacky_server.model_search(fake_request, fake_model,
filters, order_by='when')
self.assertEqual(actual_results, results)
self.mox.VerifyAll()
self.mox.VerifyAll()
class JsonReportsSearchAPI(StacktachBaseTestCase):
def setUp(self):
self.mox = mox.Mox()
self.model = models.JsonReport.objects
self.model_search_result = self.mox.CreateMockAnything()
self.model_search_result.id = '5975'
self.model_search_result.period_start = datetime.datetime(2014, 1, 18,)
self.model_search_result.period_end = datetime.datetime(2014, 1, 19)
self.model_search_result.created = 1388569200
self.model_search_result.name = 'nova usage audit'
self.model_search_result.version = 4
def tearDown(self):
self.mox.UnsetStubs()
def test_jsonreports_search_order_by_id(self):
request = self.mox.CreateMockAnything()
request.GET = {
'id': 1,
'name': 'nova_usage_audit',
'period_start': '2014-01-01 00:00:00',
'period_end': '2014-01-02 00:00:00',
'created': '2014-01-01',
}
filters = {
'id__exact': 1,
'period_start__exact': '2014-01-01 00:00:00',
'name__exact': 'nova_usage_audit',
'period_end__exact': '2014-01-02 00:00:00',
'created__lt': decimal.Decimal('1388620800'),
'created__gt': decimal.Decimal('1388534400'),
}
self.mox.StubOutWithMock(stacky_server, 'model_search')
stacky_server.model_search(request, self.model, filters,
order_by='-id').AndReturn(
[self.model_search_result])
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = [
['Id', 'Start', 'End', 'Created', 'Name', 'Version'],
['5975', '2014-01-18 00:00:00', '2014-01-19 00:00:00',
'2014-01-01 09:40:00', 'nova usage audit', 4]
]
self.assertEquals(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_with_limit_offset(self):
request = self.mox.CreateMockAnything()
request.GET = {
'period_start': '2014-01-01 09:40:00',
'name': 'nova_usage_audit',
'limit': 10,
'offset': 5
}
filters = {
'period_start__exact': '2014-01-01 09:40:00',
'name__exact': 'nova_usage_audit',
}
self.mox.StubOutWithMock(stacky_server, 'model_search')
stacky_server.model_search(request, self.model, filters,
order_by='-id').AndReturn(
[self.model_search_result])
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = \
[['Id', 'Start', 'End', 'Created', 'Name', 'Version'],
['5975', '2014-01-18 00:00:00', '2014-01-19 00:00:00',
'2014-01-01 09:40:00', 'nova usage audit', 4]]
self.assertEquals(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_with_invalid_fields(self):
request = self.mox.CreateMockAnything()
request.GET = {'invalid_column_1': 'value_1',
'invalid_column_2': 'value_2',
'version': 4,
'json': 'json',
'period_start': '2014-01-01 00:00:00'}
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = [
["Error", "Message"],
["Bad Request", "The requested fields either do not exist for the "
"corresponding object or are not searchable: invalid_column_1, "
"invalid_column_2, json, version. Note: The field names of "
"database are case-sensitive."]
]
self.assertEqual(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_with_invalid_period_start(self):
request = self.mox.CreateMockAnything()
request.GET = {'period_start': '1234'}
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = [
["Error", "Message"],
["Bad Request", "'1234' value has an invalid format. It must be in "
"YYYY-MM-DD HH:MM[:ss[.uuuuuu]][TZ] format."]
]
self.assertEqual(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_with_invalid_period_end(self):
request = self.mox.CreateMockAnything()
request.GET = {'period_end': '1234'}
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = [
["Error", "Message"],
["Bad Request", "'1234' value has an invalid format. It must be in "
"YYYY-MM-DD HH:MM[:ss[.uuuuuu]][TZ] format."]
]
self.assertEqual(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_with_invalid_id(self):
request = self.mox.CreateMockAnything()
request.GET = {'id': 'abcd'}
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = [
["Error", "Message"],
["Bad Request", "'abcd' value has an invalid format. It must be in "
"integer format."]
]
self.assertEqual(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_with_invalid_created_format(self):
request = self.mox.CreateMockAnything()
request.GET = {
'created': '2014-01-01 00:00:00'
}
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = [
["Error", "Message"],
["Bad Request", "'2014-01-01 00:00:00' value has an invalid format."
" It must be in YYYY-MM-DD format."]
]
self.assertEqual(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()
def test_jsonreports_search_by_invalid_created_400(self):
request = self.mox.CreateMockAnything()
request.GET = {
'created': '1234'}
self.mox.ReplayAll()
actual_result = stacky_server.do_jsonreports_search(request).content
expected_result = \
[
["Error", "Message"],
["Bad Request", "'1234' value has an invalid format. It must be in "
"YYYY-MM-DD format."]
]
self.assertEquals(ast.literal_eval(actual_result), expected_result)
self.mox.VerifyAll()

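The `created__gt`/`created__lt` decimals in the filter dict asserted above come from expanding a `created` date into a one-day UTC epoch window. A minimal sketch of that expansion (not the actual `stacky_server` code; the helper name is hypothetical, and it assumes the `created` column stores UTC epoch seconds):

```python
import calendar
import datetime
import decimal

def created_day_filters(created):
    # Hypothetical helper: expand a 'YYYY-MM-DD' string into the one-day
    # created__gt/created__lt window seen in the test filters above.
    day = datetime.datetime.strptime(created, '%Y-%m-%d')
    start = decimal.Decimal(calendar.timegm(day.timetuple()))
    return {'created__gt': start, 'created__lt': start + 86400}
```

For `'2014-01-01'` this produces the `1388534400`/`1388620800` pair used in `test_jsonreports_search_order_by_id`.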
@@ -0,0 +1,60 @@
import datetime
import mox
from tests.unit import StacktachBaseTestCase
from verifier import NotFound, AmbiguousResults, FieldMismatch, NullFieldException, WrongTypeException
class VerificationExceptionTestCase(StacktachBaseTestCase):
def setUp(self):
self.mox = mox.Mox()
def tearDown(self):
self.mox.UnsetStubs()
def test_not_found_exception(self):
exception = NotFound('object_type', 'search_params')
self.assertEqual(exception.reason,
"Couldn't find object_type using search_params")
def test_ambiguous_results_exception(self):
exception = AmbiguousResults('object_type', 'search_params')
self.assertEqual(
exception.reason,
"Ambiguous results for object_type using search_params")
def test_field_mismatch_exception(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exception = FieldMismatch('field_name', 'expected', 'actual', 'uuid')
self.assertEqual(exception.reason,
"Failed at 2014-01-02 03:04:05 UTC for uuid: Expected"
" field_name to be 'expected' got 'actual'")
def test_null_field_exception(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exception = NullFieldException('field_name', '1234', 'uuid')
self.assertEqual(exception.reason,
"Failed at 2014-01-02 03:04:05 UTC for uuid: "
"field_name field was null for exist id 1234")
def test_wrong_type_exception(self):
self.mox.StubOutWithMock(datetime, 'datetime')
datetime.datetime.utcnow().AndReturn('2014-01-02 03:04:05')
self.mox.ReplayAll()
exception = WrongTypeException('field_name', 'value', '1234', 'uuid')
self.assertEqual(exception.reason,
"Failed at 2014-01-02 03:04:05 UTC for uuid: "
"{field_name: value} was of incorrect type for"
" exist id 1234")

@@ -169,6 +169,7 @@ class ConsumerTestCase(StacktachBaseTestCase):
mock_logger.debug("Processing on 'east_coast.prod.global nova'")
mock_logger.debug("Completed processing on "
"'east_coast.prod.global nova'")
mock_logger.info("Worker exiting.")
config = {
'name': 'east_coast.prod.global',
@@ -217,6 +218,7 @@ class ConsumerTestCase(StacktachBaseTestCase):
mock_logger.debug("Processing on 'east_coast.prod.global nova'")
mock_logger.debug("Completed processing on "
"'east_coast.prod.global nova'")
mock_logger.info("Worker exiting.")
config = {
'name': 'east_coast.prod.global',

@@ -19,6 +19,7 @@
# IN THE SOFTWARE.
import datetime
import decimal
TENANT_ID_1 = 'testtenantid1'
TENANT_ID_2 = 'testtenantid2'
@@ -30,7 +31,7 @@ IMAGE_UUID_1 = "12345678-6352-4dbc-8271-96cc54bf14cd"
INSTANCE_ID_1 = "08f685d9-6352-4dbc-8271-96cc54bf14cd"
INSTANCE_ID_2 = "515adf96-41d3-b86d-5467-e584edc61dab"
INSTANCE_FLAVOR_ID_1 = "performance1-120"
INSTANCE_FLAVOR_ID_1 = "1"
INSTANCE_FLAVOR_ID_2 = "performance2-120"
INSTANCE_TYPE_ID_1 = "12345"
@@ -61,6 +62,18 @@ OS_ARCH_2 = "x64"
OS_VERSION_1 = "1"
OS_VERSION_2 = "2"
LAUNCHED_AT_1 = decimal.Decimal("1.1")
LAUNCHED_AT_2 = decimal.Decimal("2.1")
DELETED_AT_1 = decimal.Decimal("3.1")
DELETED_AT_2 = decimal.Decimal("4.1")
SIZE_1 = 1234
SIZE_2 = 4567
CREATED_AT_1 = decimal.Decimal("10.1")
CREATED_AT_2 = decimal.Decimal("11.1")
TIMESTAMP_1 = "2013-06-20 17:31:57.939614"
SETTLE_TIME = 5
SETTLE_UNITS = "minutes"

@@ -114,8 +114,8 @@ def _get_deletes(start, session):
def seed(period_length):
start = get_period_start(datetime.datetime.utcnow(), period_length)
db_api.configure_db()
session = db_api.get_session()
db_api.setup_db_env()
session = db_api._get_session()
print "Populating active image usages"
usages = _get_usages(start, session)

@@ -17,6 +17,8 @@
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
import datetime
class VerificationException(Exception):
def __init__(self, reason):
@@ -44,22 +46,35 @@ class AmbiguousResults(VerificationException):
class FieldMismatch(VerificationException):
def __init__(self, field_name, expected, actual):
def __init__(self, field_name, expected, actual, uuid):
self.field_name = field_name
self.expected = expected
self.actual = actual
self.reason = "Expected %s to be '%s' got '%s'" % (self.field_name,
self.expected,
self.actual)
self.reason = \
"Failed at {failed_at} UTC for {uuid}: Expected {field_name} " \
"to be '{expected}' got '{actual}'".\
format(failed_at=datetime.datetime.utcnow(), uuid=uuid,
field_name=field_name, expected=expected,
actual=actual)
class NullFieldException(VerificationException):
def __init__(self, field_name, exist_id):
def __init__(self, field_name, exist_id, uuid):
self.field_name = field_name
self.reason = "%s field was null for exist id %s" %(field_name, exist_id)
self.reason = \
"Failed at {failed_at} UTC for {uuid}: {field_name} field " \
"was null for exist id {exist_id}".format(
failed_at=datetime.datetime.utcnow(), uuid=uuid,
field_name=field_name, exist_id=exist_id)
class WrongTypeException(VerificationException):
def __init__(self, field_name, value, exist_id):
def __init__(self, field_name, value, exist_id, uuid):
self.field_name = field_name
self.reason = "{ %s : %s } of incorrect type for exist id %s"\
%(field_name, value, exist_id)
self.reason = \
"Failed at {failed_at} UTC for {uuid}: " \
"{{{field_name}: {value}}} was of incorrect type for " \
"exist id {exist_id}".format(
failed_at=datetime.datetime.utcnow(), uuid=uuid,
field_name=field_name, value=value, exist_id=exist_id)

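The reworked exceptions above can be exercised on their own; here is a minimal standalone sketch of the new `NullFieldException` message format (without the `VerificationException` base class, which only stores `reason`):

```python
import datetime

class NullFieldException(Exception):
    # Mirrors the format introduced in this diff: messages are now
    # prefixed with a UTC timestamp and the instance uuid.
    def __init__(self, field_name, exist_id, uuid):
        self.field_name = field_name
        self.reason = (
            "Failed at {failed_at} UTC for {uuid}: {field_name} field "
            "was null for exist id {exist_id}".format(
                failed_at=datetime.datetime.utcnow(), uuid=uuid,
                field_name=field_name, exist_id=exist_id))
        super(NullFieldException, self).__init__(self.reason)
```

This is what the updated `NovaVerifierValidityTestCase` assertions check: a `Failed at … UTC for <uuid>:` prefix followed by the old message body.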
@@ -81,34 +81,34 @@ def _verify_date_field(d1, d2, same_second=False):
def _is_like_uuid(attr_name, attr_value, exist_id):
if not re.match("[0-9A-Fa-f]{8}-[0-9A-Fa-f]{4}-[0-9A-Fa-f]{4}-[0-9A-Fa-f]{4}-[0-9A-Fa-f]{12}$",
attr_value):
raise WrongTypeException(attr_name, attr_value, exist_id)
raise WrongTypeException(attr_name, attr_value, exist_id, None)
def _is_like_date(attr_name, attr_value, exist_id):
def _is_like_date(attr_name, attr_value, exist_id, instance_uuid):
if not isinstance(attr_value, decimal.Decimal):
raise WrongTypeException(attr_name, attr_value, exist_id)
raise WrongTypeException(attr_name, attr_value, exist_id, instance_uuid)
def _is_long(attr_name, attr_value, exist_id):
def _is_long(attr_name, attr_value, exist_id, instance_uuid):
if not isinstance(attr_value, long):
raise WrongTypeException(attr_name, attr_value, exist_id)
raise WrongTypeException(attr_name, attr_value, exist_id, instance_uuid)
def _is_int_in_char(attr_name, attr_value, exist_id):
def _is_int_in_char(attr_name, attr_value, exist_id, instance_uuid):
try:
int(attr_value)
except ValueError:
raise WrongTypeException(attr_name, attr_value, exist_id)
raise WrongTypeException(attr_name, attr_value, exist_id, instance_uuid)
def _is_hex_owner_id(attr_name, attr_value, exist_id):
def _is_hex_owner_id(attr_name, attr_value, exist_id, instance_uuid):
if not re.match("^[0-9a-fA-F]+$", attr_value):
raise WrongTypeException(attr_name, attr_value, exist_id)
raise WrongTypeException(attr_name, attr_value, exist_id, instance_uuid)
def _is_alphanumeric(attr_name, attr_value, exist_id):
def _is_alphanumeric(attr_name, attr_value, exist_id, instance_uuid):
if not re.match("[a-zA-Z0-9.]+$", attr_value):
raise WrongTypeException(attr_name, attr_value, exist_id)
raise WrongTypeException(attr_name, attr_value, exist_id, instance_uuid)
class Verifier(object):

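The validator regexes above explain the fixtures in the validity tests: a single trailing comma is enough to fail the alphanumeric check and raise `WrongTypeException`. A quick illustration, using the pattern copied from `_is_alphanumeric` in this diff:

```python
import re

ALNUM = "[a-zA-Z0-9.]+$"  # pattern used by _is_alphanumeric

# Valid fixture values match the full string...
assert re.match(ALNUM, "com.microsoft.server")
assert re.match(ALNUM, "2008.2")
# ...but the trailing comma in the failure fixtures blocks the match,
# since ',' is outside the character class and '$' cannot be reached.
assert re.match(ALNUM, "com.microsoft.server,") is None
assert re.match(ALNUM, "2008.2,") is None
```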
@@ -98,11 +98,11 @@ def validation_level():
def nova_event_type():
return config.get('nova_event_type', 'compute.instance.exists.verified.old')
return config.get('nova_event_type', 'compute.instance.exists.verified')
def glance_event_type():
return config.get('glance_event_type', 'image.exists.verified.old')
return config.get('glance_event_type', 'image.exists.verified')
def flavor_field_name():

@@ -51,16 +51,14 @@ def _get_child_logger():
def _verify_field_mismatch(exists, usage):
if not base_verifier._verify_date_field(
usage.created_at, exists.created_at, same_second=True):
raise FieldMismatch('created_at', exists.created_at,
usage.created_at)
raise FieldMismatch('created_at', exists.created_at, usage.created_at,
exists.uuid)
if usage.owner != exists.owner:
raise FieldMismatch('owner', exists.owner,
usage.owner)
raise FieldMismatch('owner', exists.owner, usage.owner, exists.uuid)
if usage.size != exists.size:
raise FieldMismatch('size', exists.size,
usage.size)
raise FieldMismatch('size', exists.size, usage.size, exists.uuid)
def _verify_validity(exist):
@@ -68,11 +66,12 @@ def _verify_validity(exist):
exist.uuid: 'uuid', exist.owner: 'owner'}
for (field_value, field_name) in fields.items():
if field_value is None:
raise NullFieldException(field_name, exist.id)
raise NullFieldException(field_name, exist.id, exist.uuid)
base_verifier._is_like_uuid('uuid', exist.uuid, exist.id)
base_verifier._is_like_date('created_at', exist.created_at, exist.id)
base_verifier._is_long('size', exist.size, exist.id)
base_verifier._is_hex_owner_id('owner', exist.owner, exist.id)
base_verifier._is_like_date('created_at', exist.created_at, exist.id,
exist.uuid)
base_verifier._is_long('size', exist.size, exist.id, exist.uuid)
base_verifier._is_hex_owner_id('owner', exist.owner, exist.id, exist.uuid)
def _verify_for_usage(exist, usage=None):
@@ -124,7 +123,7 @@ def _verify_for_delete(exist, delete=None):
if not base_verifier._verify_date_field(
delete.deleted_at, exist.deleted_at, same_second=True):
raise FieldMismatch('deleted_at', exist.deleted_at,
delete.deleted_at)
delete.deleted_at, exist.uuid)
def _verify(exists):
@@ -136,6 +135,9 @@ def _verify(exists):
_verify_validity(exist)
exist.mark_verified()
except VerificationException, e:
verified = False
exist.mark_failed(reason=str(e))
except Exception, e:
verified = False
exist.mark_failed(reason=e.__class__.__name__)
@@ -148,20 +150,16 @@ class GlanceVerifier(Verifier):
def __init__(self, config, pool=None):
super(GlanceVerifier, self).__init__(config, pool=pool)
def verify_for_range(self, ending_max, callback=None):
exists_grouped_by_owner_and_rawid = \
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=ending_max,
status=models.ImageExists.PENDING)
count = len(exists_grouped_by_owner_and_rawid)
def verify_exists(self, grouped_exists, callback, verifying_status):
count = len(grouped_exists)
added = 0
update_interval = datetime.timedelta(seconds=30)
next_update = datetime.datetime.utcnow() + update_interval
_get_child_logger().info("glance: Adding %s per-owner exists to queue." % count)
while added < count:
for exists in exists_grouped_by_owner_and_rawid.values():
for exists in grouped_exists.values():
for exist in exists:
exist.status = models.ImageExists.VERIFYING
exist.status = verifying_status
exist.save()
result = self.pool.apply_async(_verify, args=(exists,),
callback=callback)
@@ -174,6 +172,22 @@ class GlanceVerifier(Verifier):
next_update = datetime.datetime.utcnow() + update_interval
return count
def verify_for_range(self, ending_max, callback=None):
unsent_exists_grouped_by_owner_and_rawid = \
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=ending_max,
status=models.ImageExists.SENT_UNVERIFIED)
unsent_count = self.verify_exists(unsent_exists_grouped_by_owner_and_rawid,
None, models.ImageExists.SENT_VERIFYING)
exists_grouped_by_owner_and_rawid = \
models.ImageExists.find_and_group_by_owner_and_raw_id(
ending_max=ending_max,
status=models.ImageExists.PENDING)
count = self.verify_exists(exists_grouped_by_owner_and_rawid, callback,
models.ImageExists.VERIFYING)
return count+unsent_count
def send_verified_notification(self, exist, connection, exchange,
routing_keys=None):
# NOTE (apmelton)

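Both verifiers now share the same two-pass shape in `verify_for_range`: SENT_UNVERIFIED rows are re-queued first without a callback, then PENDING rows with the caller's callback, and the two counts are summed. A hedged sketch of that control flow (the function signature, stub parameters, and status strings are illustrative stand-ins, not the real model API):

```python
def verify_for_range(find, verify_exists, ending_max, callback=None):
    # Pass 1: exists whose verified notification was sent but which
    # were never confirmed -- re-verify them without a callback.
    unsent = find(ending_max=ending_max, status='sent_unverified')
    unsent_count = verify_exists(None, unsent, 'sent_verifying')
    # Pass 2: ordinary pending exists, with the caller's callback.
    pending = find(ending_max=ending_max, status='pending')
    count = verify_exists(callback, pending, 'verifying')
    return count + unsent_count
```

In the diff itself, `find` corresponds to `models.InstanceExists.find` (or the grouped Glance lookup) and `verify_exists` is the newly extracted method on each verifier.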
@@ -54,33 +54,34 @@ def _verify_field_mismatch(exists, launch):
if not base_verifier._verify_date_field(
launch.launched_at, exists.launched_at, same_second=True):
raise FieldMismatch('launched_at', exists.launched_at,
launch.launched_at)
launch.launched_at, exists.instance)
if getattr(launch, flavor_field_name) != \
getattr(exists, flavor_field_name):
raise FieldMismatch(flavor_field_name,
getattr(exists, flavor_field_name),
getattr(launch, flavor_field_name))
getattr(launch, flavor_field_name),
exists.instance)
if launch.tenant != exists.tenant:
raise FieldMismatch('tenant', exists.tenant,
launch.tenant)
raise FieldMismatch('tenant', exists.tenant, launch.tenant,
exists.instance)
if launch.rax_options != exists.rax_options:
raise FieldMismatch('rax_options', exists.rax_options,
launch.rax_options)
launch.rax_options, exists.instance)
if launch.os_architecture != exists.os_architecture:
raise FieldMismatch('os_architecture', exists.os_architecture,
launch.os_architecture)
launch.os_architecture, exists.instance)
if launch.os_version != exists.os_version:
raise FieldMismatch('os_version', exists.os_version,
launch.os_version)
launch.os_version, exists.instance)
if launch.os_distro != exists.os_distro:
raise FieldMismatch('os_distro', exists.os_distro,
launch.os_distro)
launch.os_distro, exists.instance)
def _verify_for_launch(exist, launch=None,
@@ -147,12 +148,12 @@ def _verify_for_delete(exist, delete=None,
if not base_verifier._verify_date_field(
delete.launched_at, exist.launched_at, same_second=True):
raise FieldMismatch('launched_at', exist.launched_at,
delete.launched_at)
delete.launched_at, exist.instance)
if not base_verifier._verify_date_field(
delete.deleted_at, exist.deleted_at, same_second=True):
raise FieldMismatch(
'deleted_at', exist.deleted_at, delete.deleted_at)
raise FieldMismatch('deleted_at', exist.deleted_at,
delete.deleted_at, exist.instance)
def _verify_basic_validity(exist):
@@ -164,11 +165,14 @@ def _verify_basic_validity(exist):
}
for (field_name, field_value) in fields.items():
if field_value is None:
raise NullFieldException(field_name, exist.id)
base_verifier._is_hex_owner_id('tenant', exist.tenant, exist.id)
base_verifier._is_like_date('launched_at', exist.launched_at, exist.id)
raise NullFieldException(field_name, exist.id, exist.instance)
base_verifier._is_hex_owner_id(
'tenant', exist.tenant, exist.id, exist.instance)
base_verifier._is_like_date(
'launched_at', exist.launched_at, exist.id, exist.instance)
if exist.deleted_at is not None:
base_verifier._is_like_date('deleted_at', exist.deleted_at, exist.id)
base_verifier._is_like_date(
'deleted_at', exist.deleted_at, exist.id, exist.instance)
def _verify_optional_validity(exist):
@@ -178,11 +182,15 @@ def _verify_optional_validity(exist):
exist.os_distro: 'os_distro'}
for (field_value, field_name) in fields.items():
if field_value == '':
raise NullFieldException(field_name, exist.id)
base_verifier._is_int_in_char('rax_options', exist.rax_options, exist.id)
base_verifier._is_alphanumeric('os_architecture', exist.os_architecture, exist.id)
base_verifier._is_alphanumeric('os_distro', exist.os_distro, exist.id)
base_verifier._is_alphanumeric('os_version', exist.os_version, exist.id)
raise NullFieldException(field_name, exist.id, exist.instance)
base_verifier._is_int_in_char(
'rax_options', exist.rax_options, exist.id, exist.instance)
base_verifier._is_alphanumeric(
'os_architecture', exist.os_architecture, exist.id, exist.instance)
base_verifier._is_alphanumeric(
'os_distro', exist.os_distro, exist.id, exist.instance)
base_verifier._is_alphanumeric(
'os_version', exist.os_version, exist.id, exist.instance)
def _verify_validity(exist, validation_level):
@@ -290,9 +298,7 @@ class NovaVerifier(base_verifier.Verifier):
message_service.send_notification(
json_body[1], key, connection, exchange)
def verify_for_range(self, ending_max, callback=None):
exists = models.InstanceExists.find(
ending_max=ending_max, status=models.InstanceExists.PENDING)
def verify_exists(self, callback, exists, verifying_status):
count = exists.count()
added = 0
update_interval = datetime.timedelta(seconds=30)
@@ -300,7 +306,7 @@ class NovaVerifier(base_verifier.Verifier):
_get_child_logger().info("nova: Adding %s exists to queue." % count)
while added < count:
for exist in exists[0:1000]:
exist.update_status(models.InstanceExists.VERIFYING)
exist.update_status(verifying_status)
exist.save()
validation_level = self.config.validation_level()
result = self.pool.apply_async(
@@ -315,6 +321,20 @@ class NovaVerifier(base_verifier.Verifier):
next_update = datetime.datetime.utcnow() + update_interval
return count
def verify_for_range(self, ending_max, callback=None):
sent_unverified_exists = models.InstanceExists.find(
ending_max=ending_max, status=
models.InstanceExists.SENT_UNVERIFIED)
sent_unverified_count = self.verify_exists(None,
sent_unverified_exists,
models.InstanceExists.
SENT_VERIFYING)
exists = models.InstanceExists.find(
ending_max=ending_max, status=models.InstanceExists.PENDING)
count = self.verify_exists(callback, exists,
models.InstanceExists.VERIFYING)
return count+sent_unverified_count
def reconcile_failed(self):
for failed_exist in self.failed:
self.reconciler.failed_validation(failed_exist)

@@ -25,13 +25,13 @@ def _get_parent_logger():
def kill_time(signal, frame):
log_listener.end()
print "dying ..."
for process in processes:
process.terminate()
print "rose"
for process in processes:
process.join()
log_listener.end()
print "bud"
sys.exit(0)

@@ -19,6 +19,7 @@
import datetime
import sys
import time
import signal
import kombu
import kombu.mixins
@@ -40,6 +41,7 @@ from stacktach import stacklog
from stacktach import views
stacklog.set_default_logger_name('worker')
shutdown_soon = False
def _get_child_logger():
@@ -60,6 +62,7 @@ class Consumer(kombu.mixins.ConsumerMixin):
self.total_processed = 0
self.topics = topics
self.exchange = exchange
signal.signal(signal.SIGTERM, self._shutdown)
def _create_exchange(self, name, type, exclusive=False, auto_delete=False):
return message_service.create_exchange(name, exchange_type=type,
@@ -135,9 +138,14 @@ class Consumer(kombu.mixins.ConsumerMixin):
(e, json.loads(str(message.body))))
raise
def _shutdown(self, signal, stackframe = False):
global shutdown_soon
self.should_stop = True
shutdown_soon = True
def continue_running():
return True
return not shutdown_soon
def exit_or_sleep(exit=False):
@@ -197,6 +205,10 @@ def run(deployment_config, deployment_id, exchange):
"exception=%s. Retrying in 5s"
logger.exception(msg % (name, exchange, e))
exit_or_sleep(exit_on_exception)
logger.info("Worker exiting.")
signal.signal(signal.SIGINT, signal.SIG_IGN)
signal.signal(signal.SIGTERM, signal.SIG_IGN)
POST_PROCESS_METHODS = {
'RawData': views.post_process_rawdata,