Retire stackforge/swiftsync
This commit is contained in:
parent ea4c2c2903
commit 69e8c55a13
19 .gitignore (vendored)
@@ -1,19 +0,0 @@
AUTHORS
ChangeLog
dist/
.tox
*.egg-info
*.py[co]
.DS_Store
*.log
.testrepository
subunit.log
build
swiftclient/versioninfo
.autogenerated
.coverage
cover/
coverage.xml
doc/source/api/
.coverage*
etc/config.ini
@@ -1,4 +0,0 @@
[gerrit]
host=review.openstack.org
port=29418
project=stackforge/swiftsync.git
2 .mailmap
@@ -1,2 +0,0 @@
Fabien Boucher <fabien.boucher@enovance.com> Fabien Boucher <fabien.dot.boucher@gmail.com>
Fabien Boucher <fabien.boucher@enovance.com> Fabien Boucher <fabien@DrZoidberg.(none)>
@@ -1,4 +0,0 @@
[DEFAULT]
test_command=${PYTHON:-python} -m subunit.run discover -t ./ ./tests/units/ $LISTOPT $IDOPTION
test_id_option=--load-list $IDFILE
test_list_option=--list
211 README.md
@@ -1,211 +0,0 @@
A massive Swift syncer
======================

The purpose of this tool is to give you a way to migrate
the entire content of a Swift cluster to another one by
using the Swift REST API.

The swiftsync project comes with two tools:

- swfiller
- swsync

The first eases swsync testing: it lets you populate a test
cluster quickly with heterogeneous data.

The second is the syncer. It reads the origin Swift cluster
account by account and synchronizes the data, skipping data
that is already up to date.
Run unit tests
--------------

Unit tests can be run right after cloning the project from
GitHub:

    $ sudo pip install tox
    $ cd swiftsync
    $ tox
Run functional tests
--------------------

You can run the functional tests for the swsync tool by
installing two Swift deployments on devstack, setting the
ResellerAdmin role on the admin user in Keystone (refer to
the swsync usage section later in this README), and starting
nose as follows:

    $ nosetests -v --nologcapture tests/functional/test_syncer.py
swiftsync installation
----------------------

Prepare a Python virtual environment and run setup.py
install:

    $ virtualenv $HOME/venv
    $ . $HOME/venv/bin/activate
    $ pip install -r tools/pip-requires
    $ python setup.py install

Note: without the manual pip install, the installation may fail with
the error 'TypeError: dist must be a Distribution instance'.
ref: https://bugs.launchpad.net/swift/+bug/1217288
swfiller usage
--------------

This script aims to fill a Swift cluster with random data.
A configurable number of accounts is created in Keystone,
then many containers and objects are pushed to those accounts.
Accounts and objects are flavored with some random metadata.

Two indexes are pickled to the filesystem: the first stores
which accounts have been created, the second which
containers/objects (plus MD5 and metadata) have been stored.

This script uses eventlet to speed up the filling process as
much as possible. The default concurrency values can be
modified in the configuration file.

Before using the filler you need to add a configuration file
by copying the sample one (etc/config-sample.ini) and then
editing the keystone_origin address and keystone_origin_admin_credentials
(tenant:username:password). Be sure to use a user with the
Keystone admin role so the filler can create tenants and users.

The filler randomizes the data in the following ways:

* random account names
* random metadata on accounts (some will contain unicode)
* random container names
* random metadata on containers (some will contain unicode)
* random object names with random garbage data in them
* random metadata on objects (some will contain unicode)
* some objects will be created with empty content

The command below will fill the Swift cluster:

    $ swfiller --create -a 10 -u 1 -c 10 -f 10 -s 1024 --config etc/config.ini

The options mean the following:

* --create : creation mode (there is also a deletion mode to clean tenants and data)
* -a : number of accounts (tenants) to create
* -u : number of users to create for each account
* -c : number of containers to create for each account
* -f : number of files to create in each container
* -s : the maximum size of files to create (in bytes)

As mentioned above there is also a deletion mode that uses
the index files created during the fill operations. The index
files keep a list of the users and tenants created in Keystone.
So, to clean all the accounts and data the create mode has
created, use the deletion mode:

    $ swfiller --delete --config etc/config.ini
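As a rough sanity check, the totals produced by a `--create` run follow directly from the option semantics above; a minimal sketch (the helper name is hypothetical, not part of swiftsync):

```python
def filler_totals(accounts, users, containers, files):
    """Rough totals for a swfiller --create run.

    Mirrors the option semantics described above: -a accounts,
    -u users per account, -c containers per account, -f files
    per container.
    """
    return {
        'users': accounts * users,
        'containers': accounts * containers,
        'objects': accounts * containers * files,
    }

# The README example: swfiller --create -a 10 -u 1 -c 10 -f 10
totals = filler_totals(10, 1, 10, 10)
print(totals)  # {'users': 10, 'containers': 100, 'objects': 1000}
```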
swsync usage
------------

The synchronization process does not handle Keystone
synchronization. Database synchronization needs to be done
by configuring the replication capabilities of the Keystone
database.

The user used by the sync tool needs to be able to perform
API operations on each account on both the origin and the
destination cluster. To do that the user must own the
ResellerAdmin role.

Adding the ResellerAdmin role to the admin user in Keystone
is straightforward with the following command (be sure your
environment variables are properly set beforehand):

    $ keystone user-role-add --tenant admin --user \
      admin --role ResellerAdmin

swsync replicates:

* accounts and account metadata
* containers and container metadata
* objects and object metadata

It proceeds as follows:

* synchronizes account metadata if it has changed on the origin
* deletes a container on the destination if it no longer exists on the origin
* creates a container on the destination if it does not exist
* synchronizes destination container metadata if it differs
  from the origin container's
* removes a container object if it no longer exists in the origin container
* synchronizes an object and its metadata if the last-modified header
  is more recent on the origin

To start the synchronization process you need to edit the
configuration file and configure keystone_dest and
keystone_dest_credentials. Then simply start the process:

    $ swsync etc/config.ini

As mentioned above, the sync process won't replicate origin
Keystone accounts to the destination Keystone, so Swift
accounts on the destination will not work until you start a
Keystone database synchronization. When performing the
database synchronization, be sure the Swift endpoints are
configured to reference the destination Swift.

swsync takes care of already synchronized containers/objects:
when restarted, it only synchronizes data that has changed.
swsync is designed to be run again and again without ensuring
that the first pass went well; if, for example, there is a
network failure, swsync just skips the affected data and
retries it on the next run. The tool can therefore be launched
by a cron job, for instance, to perform a differential
synchronization each night.
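The last-modified rule above amounts to a per-object header comparison; a minimal sketch, assuming both sides expose the ISO-8601 `last_modified` field that Swift container listings provide (the helper name is hypothetical):

```python
def needs_sync(orig_obj, dest_obj):
    """Decide whether an origin object must be copied.

    orig_obj/dest_obj are dicts from a container listing with a
    'last_modified' ISO-8601 string; dest_obj is None when the
    object does not exist yet on the destination.
    """
    if dest_obj is None:
        return True
    # ISO-8601 timestamps compare correctly as plain strings.
    return orig_obj['last_modified'] > dest_obj['last_modified']

print(needs_sync({'last_modified': '2013-08-01T10:00:00'},
                 {'last_modified': '2013-07-01T10:00:00'}))  # True
```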
Tenant Filter File
------------------

It is possible to limit the migration to a subset of the total number of
tenants by uncommenting the field "tenant_filter_file". This field should
hold the path to a file containing a list of tenant names to migrate, one
per line. If left commented, swsync will migrate all the tenants.
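The filter file is plain text, one tenant name per line; loading it into a set can be sketched as follows (the demo file content here is made up):

```python
import tempfile

def load_tenant_filter(path):
    """Return the set of tenant names listed in the filter file."""
    with open(path) as f:
        # Ignore blank lines and surrounding whitespace.
        return {line.strip() for line in f if line.strip()}

# Demo: two tenant names, one per line.
with tempfile.NamedTemporaryFile('w', suffix='.list', delete=False) as tmp:
    tmp.write('demo\nadmin\n')
print(load_tenant_filter(tmp.name) == {'demo', 'admin'})  # True
```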
Swift Middleware last-modified
------------------------------

A Swift middleware has been written to speed up the
synchronization process by adding a last-modified metadata
entry to the container headers. The idea is to only process
a container when its timestamp is greater on the origin,
avoiding uselessly walking through unchanged containers.
While testing we found the synchronization performance fast
enough for our use case, so we decided not to support this
metadata in swsync for now. If you want to contribute, feel
free to add it!
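The skip test this middleware enables can be sketched as follows, assuming the default `key_name` so the header is `X-Container-Meta-Last-Modified` (the helper name and its signature are hypothetical):

```python
META = 'x-container-meta-last-modified'

def container_changed(orig_headers, last_sync_epoch):
    """True when the origin container was modified after the last
    synchronization, or when the meta is absent entirely."""
    ts = orig_headers.get(META)
    if ts is None:
        # No meta set: fall back to walking the container.
        return True
    return float(ts) > last_sync_epoch

print(container_changed({META: '1380000000.5'}, 1370000000.0))  # True
```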
Things to consider
------------------

swfiller and swsync are not designed to work with Swift v1.0 authentication.
We experienced some performance trouble when doing large synchronizations
with token validation: validating the token on each request can come back
with errors because of Keystone's limited capacity to handle a large number
of token validation requests.
Reporting a bug
---------------

The issue tracker is managed by Launchpad, so please use the
following link to report a bug:

    https://bugs.launchpad.net/swiftsync

If you want to submit a patch please use https://review.openstack.org.
If you are not familiar with the OpenStack way of submitting patches,
please first read https://wiki.openstack.org/wiki/How_To_Contribute.
7 README.rst (new file)
@@ -0,0 +1,7 @@
This project is no longer maintained.

The contents of this repository are still available in the Git source code
management system. To see the contents of this repository before it reached
its end of life, please check out the previous commit with
"git checkout HEAD^1".
143 bin/swfiller
@@ -1,143 +0,0 @@
#!/usr/bin/env python

# -*- encoding: utf-8 -*-
import argparse
import logging
import os
import pickle
import sys

import eventlet
from keystoneclient.v2_0 import client as ksclient

sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
from swsync import filler
from swsync import utils


def main():
    parser = argparse.ArgumentParser(prog='swift-filler',
                                     add_help=True)
    parser.add_argument('--delete',
                        action='store_true',
                        help='Suppress created accounts/users')
    parser.add_argument('--create',
                        action='store_true',
                        help='Create account/users/containers/data')
    parser.add_argument('-l',
                        action='store_true',
                        help='Load previous indexes and append newly'
                             ' created to it')
    parser.add_argument('-a',
                        help='Specify account amount')
    parser.add_argument('-u',
                        help='Specify user amount by account')
    parser.add_argument('-c',
                        help='Specify container amount by account')
    parser.add_argument('-f',
                        help='Specify file amount by account')
    parser.add_argument('-s',
                        help='Specify the MAX file size. Files '
                             'will be from 1024 Bytes to MAX Bytes')
    parser.add_argument('-d', '--log-level',
                        dest='log_level',
                        default='info',
                        help='Specify the log level')
    parser.add_argument('--config',
                        dest='config',
                        help='Optional configuration file path')
    args = parser.parse_args()

    utils.set_logging(args.log_level)

    if args.config and os.path.isfile(args.config):
        try:
            conf = utils.parse_ini(args.config)
        except Exception, exc:
            logging.info('Unable to parse provided conf file')
            logging.error(exc)
            sys.exit(1)
    else:
        try:
            conf = utils.parse_ini()
        except(utils.ConfigurationError):
            parser.print_help()
            sys.exit(1)

    utils.CONFIG = conf

    if not args.create and not args.delete:
        parser.print_help()
        sys.exit(1)
    if args.create and args.delete:
        parser.print_help()
        sys.exit(1)

    sw_c_concu = int(utils.get_config('concurrency',
                                      'filler_swift_client_concurrency'))
    ks_c_concu = int(utils.get_config('concurrency',
                                      'filler_keystone_client_concurrency'))
    pile = eventlet.GreenPile(sw_c_concu)
    pool = eventlet.GreenPool(ks_c_concu)

    _config = utils.get_config('auth',
                               'keystone_origin_admin_credentials').split(':')
    tenant_name, username, password = _config
    client = ksclient.Client(
        auth_url=utils.get_config('auth', 'keystone_origin'),
        username=username,
        password=password,
        tenant_name=tenant_name)

    index_path = utils.get_config('filler', 'index_path')
    index_containers_path = utils.get_config('filler', 'index_containers_path')

    if args.l:
        index = filler.load_index()
        index_containers = filler.load_containers_index()
    else:
        index = {}
        index_containers = {}
    if args.create:
        if args.a is None or not args.a.isdigit():
            logging.info("Provide account amount by setting '-a' option")
            sys.exit(1)
        if args.u is None or not args.u.isdigit():
            logging.info("Provide user by account "
                         "amount by setting '-u' option")
            sys.exit(1)
        if args.s is None:
            fmax = 1024
        else:
            if args.s.isdigit():
                fmax = max(1024, int(args.s))
            else:
                fmax = 1024
        created = filler.create_swift_account(client, pile,
                                              int(args.a),
                                              int(args.u), index=index)
        if args.f is not None and args.c is not None:
            if args.f.isdigit() and args.c.isdigit():
                filler.fill_swift(pool, created, int(args.c),
                                  int(args.f), fmax,
                                  index_containers=index_containers)
            else:
                logging.info("'-c' and '-f' options must be integers")
                sys.exit(1)
        pickle.dump(index, open(index_path, 'w'))
        pickle.dump(index_containers, open(index_containers_path, 'w'))
    if args.delete:
        index = filler.load_index()
        for k, v in index.items():
            user_info_list = [user[1] for user in v]
            # Take the first user we find
            filler.delete_account_content(k, v[0])
            filler.delete_account(client, user_info_list, k)
            del index[k]
        if not os.path.exists(index_path):
            logging.info("No index_path to load.")
            sys.exit(1)
        pickle.dump(index, open(index_path, 'w'))


if __name__ == '__main__':
    main()
54 bin/swsync
@@ -1,54 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import optparse
import sys

import swsync.accounts
import swsync.utils


class Main(object):
    def __init__(self):
        self.options = {}

    def main(self):
        usage = "usage: %prog [OPTIONS] [CONF_FILE]"
        parser = optparse.OptionParser(usage=usage)
        parser.add_option(
            '-l', '--log-level',
            dest='log_level',
            default='info',
            help='Specify the log level')
        self.options, args = parser.parse_args()
        if args:
            conf = swsync.utils.parse_ini(args[0])
        else:
            try:
                conf = swsync.utils.parse_ini()
            except(swsync.utils.ConfigurationError):
                parser.print_help()
                sys.exit(1)

        swsync.utils.set_logging(self.options.log_level.lower())
        #beurk
        swsync.utils.CONFIG = conf
        swsync.accounts.main()


if __name__ == '__main__':
    m = Main()
    m.main()
@@ -1,28 +0,0 @@
[auth]
keystone_origin = http://vm:5000/v2.0
keystone_origin_admin_credentials = admin:admin:ADMIN

keystone_dest = http://vm2:5000/v2.0

keystone_origin_demo_credentials = demo:demo:ADMIN
keystone_dest_credentials = demo:demo:ADMIN

[filler]
swift_operator_role = Member

default_user_password = password
default_user_email = johndoe@domain.com

index_path = /tmp/swift_filler_index.pkl
index_containers_path = /tmp/swift_filler_containers_index.pkl

[concurrency]
filler_keystone_client_concurrency = 5
filler_swift_client_concurrency = 10
# This is usually bound to the max open files.
sync_swift_client_concurrency = 10

[sync]
# Uncomment this field to designate a file containing a list of tenant names
# to be migrated. If left commented, all the tenants will be targeted.
# tenant_filter_file = etc/tenants.list
@@ -1,85 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Fabien Boucher <fabien.boucher@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import time

from swift.common import swob
from swift.common import utils
from swift.common import wsgi


class LastModifiedMiddleware(object):
    """This middleware updates a container's Last-Modified meta.

    LastModified is a middleware that adds a meta entry to a container
    when that container and/or objects in it are modified. The meta
    data contains the epoch timestamp. This middleware aims
    to be used with the synchronizer. It limits the tree parsing
    by giving a way to know whether a container has been modified since
    the last container synchronization.

    Actions that lead to the container meta modification:
    - POST/PUT on container
    - POST/PUT/DELETE on object in it

    The following shows an example of proxy-server.conf:
    [pipeline:main]
    pipeline = catch_errors cache tempauth last-modified proxy-server

    [filter:last-modified]
    use = egg:swift#last_modified
    # will show as X-Container-Meta-${key_name} for the container's header.
    key_name = Last-Modified
    """

    def __init__(self, app, conf):
        self.app = app
        self.conf = conf
        self.logger = utils.get_logger(self.conf, log_route='last_modified')
        self.key_name = conf.get('key_name',
                                 'Last-Modified').strip().replace(' ', '-')

    def update_last_modified_meta(self, req, env):
        vrs, account, container, obj = req.split_path(1, 4, True)
        path = env['PATH_INFO']
        if obj:
            path = path.split('/%s' % obj)[0]
        metakey = 'X-Container-Meta-%s' % self.key_name
        headers = {metakey: str(time.time())}
        set_meta_req = wsgi.make_pre_authed_request(env,
                                                    method='POST',
                                                    path=path,
                                                    headers=headers,
                                                    swift_source='lm')
        set_meta_req.get_response(self.app)

    @swob.wsgify
    def __call__(self, req):
        vrs, account, container, obj = req.split_path(1, 4, True)
        if (req.method in ('POST', 'PUT') and
                container or req.method == 'DELETE' and obj):
            new_env = req.environ.copy()
            self.update_last_modified_meta(req, new_env)
        return self.app


def filter_factory(global_conf, **local_conf):
    conf = global_conf.copy()
    conf.update(local_conf)

    return lambda app: LastModifiedMiddleware(app, conf)
@@ -1,7 +0,0 @@
[DEFAULT]

# The list of modules to copy from openstack-common
modules=setup

# The base module to hold the copy of openstack.common
base=swsync
59 setup.py
@@ -1,59 +0,0 @@
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import setuptools

from swsync.openstack.common import setup

name = 'swsync'

requires = setup.parse_requirements()
depend_links = setup.parse_dependency_links()
entry_point = '%s.middlewares:last_modified' % (name)

setuptools.setup(
    name=name,
    version=setup.get_version(name),
    description='A massive swift syncer',
    url='https://github.com/enovance/swsync',
    license='Apache License (2.0)',
    author='eNovance SAS.',
    author_email='dev@enovance.com',
    packages=setuptools.find_packages(exclude=['tests', 'tests.*']),
    cmdclass=setup.get_cmdclass(),
    install_requires=requires,
    dependency_links=depend_links,
    classifiers=[
        'Development Status :: 4 - Beta',
        'Environment :: Console',
        'Environment :: OpenStack',
        'Intended Audience :: Developers',
        'Intended Audience :: Information Technology',
        'License :: OSI Approved :: Apache Software License',
        'Operating System :: OS Independent',
        'Programming Language :: Python :: 2.6',
        'Environment :: No Input/Output (Daemon)',
    ],
    scripts=[
        'bin/swfiller',
        'bin/swsync',
    ],
    entry_points={
        'paste.filter_factory': ['last_modified=%s' % entry_point]
    }
)
@@ -1 +0,0 @@
# -*- encoding: utf-8 -*-
@@ -1,200 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import logging
import os
import time

import dateutil.relativedelta
import keystoneclient.v2_0.client
import swiftclient

import swsync.containers
from utils import ConfigurationError
from utils import get_config


class Accounts(object):
    """Process Keystone Accounts."""
    def __init__(self):
        self.keystone_cnx = None
        self.container_cls = swsync.containers.Containers()

    def get_swift_auth(self, auth_url, tenant, user, password):
        """Get swift connexion from args."""
        return swiftclient.client.Connection(
            auth_url,
            '%s:%s' % (tenant, user),
            password,
            auth_version=2).get_auth()

    def get_ks_auth_orig(self):
        """Get keystone cnx from config."""
        orig_auth_url = get_config('auth', 'keystone_origin')
        cfg = get_config('auth', 'keystone_origin_admin_credentials')
        (tenant_name, username, password) = cfg.split(':')

        return keystoneclient.v2_0.client.Client(auth_url=orig_auth_url,
                                                 username=username,
                                                 password=password,
                                                 tenant_name=tenant_name)

    def get_target_tenant_filter(self):
        """Returns a set of target tenants from the tenant_list_file.

        tenant_list_file is defined in the config file or given as a command
        line argument.

        If tenant_list_file is not defined, returns None (an empty filter).
        """
        try:
            tenant_filter_filename = get_config('sync', 'tenant_filter_file')

            with open(tenant_filter_filename) as tenantsfile:
                return {name.strip() for name in tenantsfile.readlines()}
        except ConfigurationError:
            return None

    def account_headers_clean(self, account_headers, to_null=False):
        ret = {}
        for key, value in account_headers.iteritems():
            if key.startswith('x-account-meta'):
                if to_null:
                    value = ''
                ret[key] = value
        return ret

    def sync_account(self, orig_storage_url, orig_token,
                     dest_storage_url, dest_token):
        """Sync a single account with url/tok to dest_url/dest_tok."""
        orig_storage_cnx = swiftclient.http_connection(orig_storage_url)
        dest_storage_cnx = swiftclient.http_connection(dest_storage_url)
        account_id = os.path.basename(orig_storage_url.replace("AUTH_", ''))

        try:
            orig_account_headers, orig_containers = (
                swiftclient.get_account(None, orig_token,
                                        http_conn=orig_storage_cnx,
                                        full_listing=True))

            dest_account_headers, dest_containers = (
                swiftclient.get_account(None, dest_token,
                                        http_conn=dest_storage_cnx,
                                        full_listing=True))
        except(swiftclient.client.ClientException), e:
            logging.info("error getting account: %s, %s" % (
                account_id, e.http_reason))
            return

        self.container_cls.delete_container(dest_storage_cnx,
                                            dest_token,
                                            orig_containers,
                                            dest_containers)

        do_headers = False
        if len(dest_account_headers) != len(orig_account_headers):
            do_headers = True
        else:
            for k, v in orig_account_headers.iteritems():
                if not k.startswith('x-account-meta'):
                    continue
                if k not in dest_account_headers:
                    do_headers = True
                elif dest_account_headers[k] != v:
                    do_headers = True

        if do_headers:
            orig_metadata_headers = self.account_headers_clean(
                orig_account_headers)
            dest_metadata_headers = self.account_headers_clean(
                dest_account_headers, to_null=True)

            new_headers = dict(dest_metadata_headers.items() +
                               orig_metadata_headers.items())
            try:
                swiftclient.post_account(
                    "", dest_token, new_headers,
                    http_conn=dest_storage_cnx,
                )
                logging.info("HEADER: sync headers: %s" % (account_id))
            except(swiftclient.client.ClientException), e:
                logging.info("ERROR: updating container metadata: %s, %s" % (
                    account_id, e.http_reason))
                # We don't pass on because since the server was busy
                # let's pass it on for the next pass
                return

        for container in orig_containers:
            logging.info("Syncronizing container %s: %s",
                         container['name'], container)
            dt1 = datetime.datetime.fromtimestamp(time.time())
            self.container_cls.sync(orig_storage_cnx,
                                    orig_storage_url,
                                    orig_token,
                                    dest_storage_cnx,
                                    dest_storage_url, dest_token,
                                    container['name'])

            dt2 = datetime.datetime.fromtimestamp(time.time())
            rd = dateutil.relativedelta.relativedelta(dt2, dt1)
            #TODO(chmou): use logging
            logging.info("%s done: %d hours, %d minutes and %d seconds",
                         container['name'],
                         rd.hours,
                         rd.minutes, rd.seconds)

    def process(self):
        """Process all keystone accounts to sync."""
        orig_auth_url = get_config('auth', 'keystone_origin')
        orig_admin_tenant, orig_admin_user, orig_admin_password = (
            get_config('auth', 'keystone_origin_admin_credentials').split(':'))
        oa_st_url, orig_admin_token = self.get_swift_auth(
            orig_auth_url, orig_admin_tenant,
            orig_admin_user, orig_admin_password)
        dest_auth_url = get_config('auth', 'keystone_dest')

        # we assume orig and dest passwd are the same obv synchronized.
        dst_st_url, dest_admin_token = self.get_swift_auth(
            dest_auth_url, orig_admin_tenant,
            orig_admin_user, orig_admin_password)

        bare_oa_st_url = oa_st_url[:oa_st_url.find('AUTH_')] + "AUTH_"
        bare_dst_st_url = dst_st_url[:dst_st_url.find('AUTH_')] + "AUTH_"

        self.keystone_cnx = self.get_ks_auth_orig()

        # if user has defined target tenants, limit the migration
        # to them
        _targets_filters = self.get_target_tenant_filter()
        if _targets_filters is not None:
            _targets = (tenant for tenant in self.keystone_cnx.tenants.list()
                        if tenant.name in _targets_filters)
        else:
            _targets = self.keystone_cnx.tenants.list()

        for tenant in _targets:
            user_orig_st_url = bare_oa_st_url + tenant.id
            user_dst_st_url = bare_dst_st_url + tenant.id

            self.sync_account(user_orig_st_url,
                              orig_admin_token,
                              user_dst_st_url,
                              dest_admin_token)


def main():
    acc = Accounts()
    acc.process()
@ -1,199 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging

import eventlet
import swiftclient

import swsync.objects
import swsync.utils


class Containers(object):
    """Containers sync."""
    def __init__(self):
        self.concurrency = int(swsync.utils.get_config(
            "concurrency",
            "sync_swift_client_concurrency"))
        self.sync_object = swsync.objects.sync_object
        self.delete_object = swsync.objects.delete_object

    def delete_container(self, dest_storage_cnx, dest_token,
                         orig_containers,
                         dest_containers):
        set1 = set((x['name']) for x in orig_containers)
        set2 = set((x['name']) for x in dest_containers)
        delete_diff = set2 - set1

        pool = eventlet.GreenPool(size=self.concurrency)
        pile = eventlet.GreenPile(pool)
        for container in delete_diff:
            try:
                dest_container_stats, dest_objects = swiftclient.get_container(
                    None, dest_token, container, http_conn=dest_storage_cnx,
                    full_listing=True,
                )
            except(swiftclient.client.ClientException), e:
                logging.info("error getting container: %s, %s" % (
                    container, e.http_reason))
                continue

            for obj in dest_objects:
                logging.info("deleting obj: %s ts:%s", obj['name'],
                             obj['last_modified'])
                pile.spawn(self.delete_object,
                           dest_storage_cnx,
                           dest_token,
                           container,
                           obj['name'])
            pool.waitall()
            logging.info("deleting container: %s", container)
            pile.spawn(swiftclient.delete_container,
                       '', dest_token, container, http_conn=dest_storage_cnx)
        pool.waitall()

    def container_headers_clean(self, container_headers, to_null=False):
        ret = {}
        for key, value in container_headers.iteritems():
            if key.startswith('x-container-meta'):
                if to_null:
                    value = ''
                ret[key] = value
        return ret

    def sync(self, orig_storage_cnx, orig_storage_url,
             orig_token, dest_storage_cnx, dest_storage_url, dest_token,
             container_name):

        try:
            orig_container_headers, orig_objects = swiftclient.get_container(
                None, orig_token, container_name, http_conn=orig_storage_cnx,
                full_listing=True,
            )
        except(swiftclient.client.ClientException), e:
            logging.info("ERROR: getting container: %s, %s" % (
                container_name, e.http_reason))
            return

        try:
            # Check that the container exists on dest
            swiftclient.head_container(
                "", dest_token, container_name, http_conn=dest_storage_cnx
            )
        except(swiftclient.client.ClientException), e:
            container_headers = orig_container_headers.copy()
            for h in ('x-container-object-count', 'x-trans-id',
                      'x-container-bytes-used'):
                try:
                    del container_headers[h]
                except KeyError:
                    # Nov2013: swift server does not set x-trans-id header
                    pass
            p = dest_storage_cnx[0]
            url = "%s://%s%s" % (p.scheme, p.netloc, p.path)
            try:
                swiftclient.put_container(url,
                                          dest_token, container_name,
                                          headers=container_headers)
            except(swiftclient.client.ClientException), e:
                logging.info("ERROR: creating container: %s, %s" % (
                    container_name, e.http_reason))
                return

        try:
            dest_container_headers, dest_objects = swiftclient.get_container(
                None, dest_token, container_name, http_conn=dest_storage_cnx,
                full_listing=True,
            )
        except(swiftclient.client.ClientException), e:
            logging.info("ERROR: listing dest container: %s, %s" % (
                container_name, e.http_reason))
            return

        try:
            header_key = 'x-container-meta-last-modified'
            orig_ts = float(orig_container_headers[header_key])
            dest_ts = float(dest_container_headers[header_key])
            if orig_ts < dest_ts:
                logging.info("Dest is up-to-date")
                return
        except(KeyError):
            # last-modified swift middleware is not active
            pass
        except(ValueError):
            logging.error("Could not decode last-modified header!")

        do_headers = False
        if len(dest_container_headers) != len(orig_container_headers):
            do_headers = True
        else:
            for k, v in orig_container_headers.iteritems():
                if (k.startswith('x-container-meta') and
                        k in dest_container_headers):
                    if dest_container_headers[k] != v:
                        do_headers = True

        if do_headers:
            orig_metadata_headers = self.container_headers_clean(
                orig_container_headers)
            dest_metadata_headers = self.container_headers_clean(
                dest_container_headers, to_null=True)
            new_headers = dict(dest_metadata_headers.items() +
                               orig_metadata_headers.items())
            try:
                swiftclient.post_container(
                    "", dest_token, container_name, new_headers,
                    http_conn=dest_storage_cnx,
                )
                logging.info("HEADER: sync headers: %s" % (container_name))
            except(swiftclient.client.ClientException), e:
                logging.info("ERROR: updating container metadata: %s, %s" % (
                    container_name, e.http_reason))
                # The server was busy; bail out and let the next
                # pass retry the metadata update.
                return

        set1 = set((x['last_modified'], x['name']) for x in orig_objects)
        set2 = set((x['last_modified'], x['name']) for x in dest_objects)
        diff = set1 - set2
        set1 = set(x['name'] for x in orig_objects)
        set2 = set(x['name'] for x in dest_objects)
        delete_diff = set2 - set1

        if not diff and not delete_diff:
            return

        pool = eventlet.GreenPool(size=self.concurrency)
        pile = eventlet.GreenPile(pool)

        for obj in diff:
            logging.info("sending: %s ts:%s", obj[1], obj[0])
            pile.spawn(self.sync_object,
                       orig_storage_url,
                       orig_token,
                       dest_storage_url,
                       dest_token, container_name,
                       obj)

        for obj in delete_diff:
            # delete_diff holds bare object names, not (ts, name) tuples
            logging.info("deleting: %s", obj)
            pile.spawn(self.delete_object,
                       dest_storage_cnx,
                       dest_token,
                       container_name,
                       obj)
        pool.waitall()
308
swsync/filler.py
@ -1,308 +0,0 @@
# -*- python -*-
# Author(s): Fabien Boucher <fabien.boucher@enovance.com>
#
# This script aims to fill a swift cluster with random data.
# A custom amount of accounts will be created against keystone,
# then many containers and objects will be pushed to those accounts.
# Accounts and objects will be flavored with some random metadata.
#
# Two indexes will be pickled to the filesystem: first, which accounts
# have been created (index_path); second, which containers/objects plus
# MD5 and metadata have been stored (index_containers_path).
#
# This script uses eventlet to speed up the fill-in process as much
# as possible.
#
# Usage:
#
# python swift-filler.py --create -a 10 -u 1 -f 5 -c 2 -s 5000 -l
# The above command will create 10 accounts with one user in each (in
# keystone), then 2 containers will be created with 5 files in each.
# Each file will be generated with a size between 1024 bytes and
# 5000 bytes.
#
# python swift-filler.py --delete
# Read the pickled index file (index_path) to delete the
# objects/containers stored in swift for each account, then delete
# the accounts.

import os
import sys

import copy
import logging
import pickle
import random
import string
import StringIO

from swiftclient import client as sclient
from swiftclient.client import ClientException

from keystoneclient.exceptions import ClientException as KSClientException

import eventlet

sys.path.append("../")
from utils import get_config

eventlet.patcher.monkey_patch()

# Some unicode codepoints
ucodes = (u'\u00b5', u'\u00c6', u'\u0159', u'\u0267',
          u'\u02b6', u'\u0370', u'\u038F', u'\u03EA',
          u'\u046A')


def get_rand_str(mode='user'):
    prefix = "%s" % mode
    return prefix + ''.join(random.choice(
        string.ascii_uppercase + string.digits) for x in range(8))


def customize(bstr, mdl):
    if mdl == 0:
        return bstr
    elif mdl == 1:
        return bstr + " s"
    elif mdl == 2:
        return unicode(bstr, 'utf8') + u'_' + u"".\
            join([random.choice(ucodes) for i in range(3)])
    else:
        return bstr


def create_swift_user(client, account_name, account_id, user_amount):
    users = []

    def _create_user(account_name, account_id):
        user = get_rand_str(mode='user_')
        # Create a user in that tenant
        uid = client.users.create(user,
                                  get_config('filler',
                                             'default_user_password'),
                                  get_config('filler', 'default_user_email'),
                                  account_id)
        # Get the swift_operator_role id
        roleid = [role.id for role in client.roles.list()
                  if role.name == get_config('filler', 'swift_operator_role')]
        if not roleid:
            logging.error('Could not find swift_operator_role %s in keystone' %
                          get_config('filler', 'swift_operator_role'))
            sys.exit(1)
        roleid = roleid[0]
        # Add the tenant/user to the swift operator role/group
        client.roles.add_user_role(uid.id, roleid, account_id)
        return (user, uid.id, roleid)
    for i in range(user_amount):
        try:
            ret = _create_user(account_name, account_id)
            logging.info('Users created %s in account %s' %
                         (str(ret), account_id))
            users.append(ret)
        except KSClientException:
            logging.warn('Unable to create a user in account %s' % account_id)
    return users


def create_swift_account(client, pile,
                         account_amount, user_amount,
                         index=None):

    def _create_account(user_amount):
        account = get_rand_str(mode='account_')
        # Create a tenant. In swift this is an account
        try:
            account_id = client.tenants.create(account).id
            logging.info('Account created %s' % account)
        except KSClientException:
            logging.warn('Unable to create account %s' % account)
            return None, None, None
        r = create_swift_user(client, account, account_id, user_amount)
        return account, account_id, r
    created = {}
    # Spawn a greenlet for each account
    i = 0
    for i in range(account_amount):
        i += 1
        logging.info("[Keystone Start OPs %s/%s]" % (i, account_amount))
        pile.spawn(_create_account, user_amount)
    for account, account_id, ret in pile:
        if account is not None:
            index[(account, account_id)] = ret
            created[(account, account_id)] = ret
    return created


def delete_account_content(acc, user):
    cnx = swift_cnx(acc, user[0])
    account_infos = cnx.get_account(full_listing=True)
    # Retrieve the container list
    container_l = account_infos[1]
    containers_name = [ci['name'] for ci in container_l]
    # Retrieve the object list
    for container in containers_name:
        container_infos = cnx.get_container(container)
        object_names = [obj_detail['name'] for obj_detail
                        in container_infos[1]]
        # Delete objects
        for obj in object_names:
            logging.info("Deleting object %s in container %s for account %s" %
                         (obj, container, str(acc)))
            cnx.delete_object(container, obj)


def delete_account(client, user_id, acc):
    account_id = acc[1]
    if not isinstance(user_id, list):
        user_id = (user_id,)
    for uid in user_id:
        logging.info("Delete user with id : %s" % uid)
        client.users.delete(uid)
    logging.info("Delete account %s" % account_id)
    client.tenants.delete(account_id)


def swift_cnx(acc, user):
    ks_url = get_config('auth', 'keystone_origin')
    cnx = sclient.Connection(ks_url,
                             user=user,
                             key=get_config('filler', 'default_user_password'),
                             tenant_name=acc[0],
                             auth_version=2)
    return cnx


def create_objects(cnx, acc, o_amount, fmax, index_containers):

    def _generate_object(f_object, size, zero_byte=False):
        if not zero_byte:
            size = random.randint(1024, size)
            end = get_rand_str('file_end_')
            f_object.seek(size - len(end))
            f_object.write(end)
            f_object.seek(0)
        else:
            f_object.seek(0)
    containers_d = index_containers[acc]
    for container, details in containers_d.items():
        for i in range(o_amount):
            f_object = StringIO.StringIO()
            if not i and o_amount > 1:
                # Generate an empty object in each container when
                # we create more than one object
                _generate_object(f_object, fmax, zero_byte=True)
            else:
                _generate_object(f_object, fmax)
            # Customize the filename
            object_name = customize(get_rand_str('file_name_'), i % 3)
            meta_keys = [customize(m, (i + 1) % 3) for m in
                         map(get_rand_str, ('X-Object-Meta-',) * 3)]
            meta_values = [customize(m, (i + 1) % 3) for m in
                           map(get_rand_str, ('meta_v_',) * 3)]
            meta = dict(zip(meta_keys, meta_values))
            data = f_object.read()
            f_object.close()
            try:
                etag = cnx.put_object(container, object_name,
                                      data, headers=copy.copy(meta))
                logging.info("Put data for container %s "
                             "(filename: %s,\tsize: %.3f KB)" %
                             (container,
                              object_name.encode('ascii', 'ignore'),
                              float(len(data)) / 1024))
                obj_info = {'object_info':
                            (object_name, etag, len(data)), 'meta': meta}
                containers_d[container]['objects'].append(obj_info)
            except ClientException:
                logging.warning('Unable to put object %s in container %s' % (
                    object_name.encode('ascii', 'ignore'),
                    container.encode('ascii', 'ignore')))


def create_containers(cnx, acc, c_amount, index_containers=None):
    containers_d = index_containers.setdefault(acc, {})
    for i in range(c_amount):
        container_name = customize(get_rand_str('container_'), i % 3)
        # python-swiftclient does not quote metadata correctly; need
        # to investigate why it does not work when keys are utf8
        # meta_keys = [customize(m, (i+1)%3) for m in
        #              map(get_rand_str, ('X-Container-Meta-',) * 3)]
        meta_keys = map(get_rand_str, ('X-Container-Meta-',) * 3)
        # meta_values = map(get_rand_str, ('meta_v_',) * 3)
        meta_values = [customize(m, (i + 1) % 3) for m in
                       map(get_rand_str, ('meta_v_',) * 3)]
        meta = dict(zip(meta_keys, meta_values))
        logging.info("Create container %s" %
                     container_name.encode('ascii', 'ignore'))
        try:
            cnx.put_container(container_name, headers=copy.copy(meta))
            containers_d[container_name] = {'meta': meta, 'objects': []}
        except(ClientException), e:
            logging.warning("Unable to create container %s due to %s" %
                            (container_name.encode('ascii', 'ignore'),
                             e))


def create_account_meta(cnx):
    meta_keys = []
    meta_values = []
    for i in range(3):
        # python-swiftclient does not quote metadata correctly; need
        # to investigate why it does not work when keys are utf8
        #meta_keys.extend([customize(m, (i + 1) % 3) for m in
        #                  map(get_rand_str, ('X-Account-Meta-',) * 1)])
        meta_keys.extend(map(get_rand_str, ('X-Account-Meta-',) * 3))
        meta_values.extend([customize(m, (i + 1) % 3) for m in
                            map(get_rand_str, ('meta_v_',) * 1)])
    meta = dict(zip(meta_keys, meta_values))
    cnx.post_account(headers=meta)


def fill_swift(pool, created_account, c_amount,
               o_amount, fmax, index_containers=None):
    def _fill_swift_job(acc, users, c_amount,
                        o_amount, fmax, index_containers):
        # Use the first user we find to fill in the swift account
        cnx = swift_cnx(acc, users[0][0])
        # TODO(fbo): must keep track of the account meta
        create_account_meta(cnx)
        create_containers(cnx, acc, c_amount, index_containers)
        create_objects(cnx, acc, o_amount, fmax, index_containers)
    i = 0
    for acc, users in created_account.items():
        i += 1
        logging.info("[Start Swift Account OPs %s/%s]" %
                     (i, len(created_account.keys())))
        pool.spawn_n(_fill_swift_job,
                     acc, users,
                     c_amount, o_amount,
                     fmax, index_containers)
    pool.waitall()


def load_index():
    index_path = get_config('filler', 'index_path')
    if os.path.isfile(index_path):
        try:
            index = pickle.load(file(index_path))
            logging.info("Load previous index for account %s" % index_path)
        except Exception:
            index = {}
    else:
        index = {}
    return index


def load_containers_index():
    index_containers_path = get_config('filler', 'index_containers_path')
    if os.path.isfile(index_containers_path):
        try:
            index = pickle.load(file(index_containers_path))
            logging.info("Load previous index for %s" % index_containers_path)
        except Exception:
            index = {}
    else:
        index = {}
    return index
@ -1,121 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging

import eventlet
import swift.common.bufferedhttp
import swift.common.http
try:
    from swift.container.sync import _Iter2FileLikeObject as FileLikeIter
except ImportError:
    # Nov2013: swift.common.utils now includes a more generic object
    from swift.common.utils import FileLikeIter

from swiftclient import client as swiftclient
import urllib
import urllib2


def quote(value, safe='/'):
    """Patched version of urllib.quote.

    Encodes utf-8 strings before quoting.
    """
    if isinstance(value, unicode):
        value = value.encode('utf-8')
    return urllib.quote(value, safe)


def get_object(storage_url, token,
               container_name,
               object_name,
               response_timeout=15,
               conn_timeout=5,
               resp_chunk_size=65536):
    headers = {'x-auth-token': token}
    x = urllib2.urlparse.urlparse(storage_url)

    path = x.path + '/' + container_name + '/' + object_name
    path = quote(path)
    with eventlet.Timeout(conn_timeout):
        conn = swift.common.bufferedhttp.http_connect_raw(
            x.hostname,
            x.port,
            'GET',
            path,
            headers=headers,
            ssl=False)

    with eventlet.Timeout(response_timeout):
        resp = conn.getresponse()

    if not swift.common.http.is_success(resp.status):
        resp.read()
        # TODO(chmou): logging
        raise swiftclient.ClientException(
            'status %s %s' % (resp.status, resp.reason))

    if resp_chunk_size:
        def _object_body():
            buf = resp.read(resp_chunk_size)
            while buf:
                yield buf
                buf = resp.read(resp_chunk_size)
        object_body = _object_body()
    else:
        object_body = resp.read()

    resp_headers = {}
    for header, value in resp.getheaders():
        resp_headers[header.lower()] = value

    return (resp_headers, object_body)


def delete_object(dest_cnx,
                  dest_token,
                  container_name,
                  object_name):
    parsed = dest_cnx[0]
    url = '%s://%s/%s' % (parsed.scheme, parsed.netloc, parsed.path)
    swiftclient.delete_object(url=url,
                              token=dest_token,
                              container=container_name,
                              http_conn=dest_cnx,
                              name=object_name)


def sync_object(orig_storage_url, orig_token, dest_storage_url,
                dest_token, container_name, object_name_etag):
    object_name = object_name_etag[1]

    orig_headers, orig_body = get_object(orig_storage_url,
                                         orig_token,
                                         container_name,
                                         object_name)
    container_name = quote(container_name)

    post_headers = orig_headers
    post_headers['x-auth-token'] = dest_token
    sync_to = dest_storage_url + "/" + container_name
    try:
        swiftclient.put_object(sync_to, name=object_name,
                               headers=post_headers,
                               contents=FileLikeIter(orig_body))
    except(swiftclient.ClientException), e:
        logging.info("error sync object: %s, %s" % (
            object_name, e.http_reason))
@ -1,367 +0,0 @@
# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2011 OpenStack Foundation.
# Copyright 2012-2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Utilities with minimum-depends for use in setup.py
"""

import email
import os
import re
import subprocess
import sys

from setuptools.command import sdist


def parse_mailmap(mailmap='.mailmap'):
    mapping = {}
    if os.path.exists(mailmap):
        with open(mailmap, 'r') as fp:
            for l in fp:
                try:
                    canonical_email, alias = re.match(
                        r'[^#]*?(<.+>).*(<.+>).*', l).groups()
                except AttributeError:
                    continue
                mapping[alias] = canonical_email
    return mapping


def _parse_git_mailmap(git_dir, mailmap='.mailmap'):
    mailmap = os.path.join(os.path.dirname(git_dir), mailmap)
    return parse_mailmap(mailmap)


def canonicalize_emails(changelog, mapping):
    """Takes in a string and an email alias mapping and replaces all
    instances of the aliases in the string with their real email.
    """
    for alias, email_address in mapping.iteritems():
        changelog = changelog.replace(alias, email_address)
    return changelog


# Get requirements from the first file that exists
def get_reqs_from_files(requirements_files):
    for requirements_file in requirements_files:
        if os.path.exists(requirements_file):
            with open(requirements_file, 'r') as fil:
                return fil.read().split('\n')
    return []


def parse_requirements(requirements_files=['requirements.txt',
                                           'tools/pip-requires']):
    requirements = []
    for line in get_reqs_from_files(requirements_files):
        # For the requirements list, we need to inject only the portion
        # after egg= so that distutils knows the package it's looking for
        # such as:
        # -e git://github.com/openstack/nova/master#egg=nova
        if re.match(r'\s*-e\s+', line):
            requirements.append(re.sub(r'\s*-e\s+.*#egg=(.*)$', r'\1',
                                line))
        # such as:
        # http://github.com/openstack/nova/zipball/master#egg=nova
        elif re.match(r'\s*https?:', line):
            requirements.append(re.sub(r'\s*https?:.*#egg=(.*)$', r'\1',
                                line))
        # -f lines are for index locations, and don't get used here
        elif re.match(r'\s*-f\s+', line):
            pass
        # argparse is part of the standard library starting with 2.7
        # adding it to the requirements list screws distro installs
        elif line == 'argparse' and sys.version_info >= (2, 7):
            pass
        else:
            requirements.append(line)

    return requirements


def parse_dependency_links(requirements_files=['requirements.txt',
                                               'tools/pip-requires']):
    dependency_links = []
    # dependency_links inject alternate locations to find packages listed
    # in requirements
    for line in get_reqs_from_files(requirements_files):
        # skip comments and blank lines
        if re.match(r'(\s*#)|(\s*$)', line):
            continue
        # lines with -e or -f need the whole line, minus the flag
        if re.match(r'\s*-[ef]\s+', line):
            dependency_links.append(re.sub(r'\s*-[ef]\s+', '', line))
        # lines that are only urls can go in unmolested
        elif re.match(r'\s*https?:', line):
            dependency_links.append(line)
    return dependency_links


def _run_shell_command(cmd, throw_on_error=False):
    if os.name == 'nt':
        output = subprocess.Popen(["cmd.exe", "/C", cmd],
                                  stdout=subprocess.PIPE,
                                  stderr=subprocess.PIPE)
    else:
        output = subprocess.Popen(["/bin/sh", "-c", cmd],
                                  stdout=subprocess.PIPE,
                                  stderr=subprocess.PIPE)
    out = output.communicate()
    if output.returncode and throw_on_error:
        raise Exception("%s returned %d" % (cmd, output.returncode))
    if len(out) == 0:
        return None
    if len(out[0].strip()) == 0:
        return None
    return out[0].strip()


def _get_git_directory():
    parent_dir = os.path.dirname(__file__)
    while True:
        git_dir = os.path.join(parent_dir, '.git')
        if os.path.exists(git_dir):
            return git_dir
        parent_dir, child = os.path.split(parent_dir)
        if not child:  # reached the root dir
            return None


def write_git_changelog():
    """Write a changelog based on the git changelog."""
    new_changelog = 'ChangeLog'
    git_dir = _get_git_directory()
    if not os.getenv('SKIP_WRITE_GIT_CHANGELOG'):
        if git_dir:
            git_log_cmd = 'git --git-dir=%s log' % git_dir
            changelog = _run_shell_command(git_log_cmd)
            mailmap = _parse_git_mailmap(git_dir)
            with open(new_changelog, "w") as changelog_file:
                changelog_file.write(canonicalize_emails(changelog, mailmap))
    else:
        open(new_changelog, 'w').close()


def generate_authors():
    """Create AUTHORS file using git commits."""
    jenkins_email = 'jenkins@review.(openstack|stackforge).org'
    old_authors = 'AUTHORS.in'
    new_authors = 'AUTHORS'
    git_dir = _get_git_directory()
    if not os.getenv('SKIP_GENERATE_AUTHORS'):
        if git_dir:
            # don't include jenkins email address in AUTHORS file
            git_log_cmd = ("git --git-dir=" + git_dir +
                           " log --format='%aN <%aE>' | sort -u | "
                           "egrep -v '" + jenkins_email + "'")
            changelog = _run_shell_command(git_log_cmd)
            signed_cmd = ("git log --git-dir=" + git_dir +
                          " | grep -i Co-authored-by: | sort -u")
            signed_entries = _run_shell_command(signed_cmd)
            if signed_entries:
                new_entries = "\n".join(
                    [signed.split(":", 1)[1].strip()
                     for signed in signed_entries.split("\n") if signed])
                changelog = "\n".join((changelog, new_entries))
            mailmap = _parse_git_mailmap(git_dir)
            with open(new_authors, 'w') as new_authors_fh:
                new_authors_fh.write(canonicalize_emails(changelog, mailmap))
                if os.path.exists(old_authors):
                    with open(old_authors, "r") as old_authors_fh:
                        new_authors_fh.write('\n' + old_authors_fh.read())
    else:
        open(new_authors, 'w').close()


_rst_template = """%(heading)s
%(underline)s

.. automodule:: %(module)s
  :members:
  :undoc-members:
  :show-inheritance:
"""


def get_cmdclass():
    """Return dict of commands to run from setup.py."""

    cmdclass = dict()

    def _find_modules(arg, dirname, files):
        for filename in files:
            if filename.endswith('.py') and filename != '__init__.py':
                arg["%s.%s" % (dirname.replace('/', '.'),
                               filename[:-3])] = True

    class LocalSDist(sdist.sdist):
        """Builds the ChangeLog and Authors files from VC first."""

        def run(self):
            write_git_changelog()
            generate_authors()
            # sdist.sdist is an old style class, can't use super()
            sdist.sdist.run(self)

    cmdclass['sdist'] = LocalSDist

    # If Sphinx is installed on the box running setup.py,
    # enable setup.py to build the documentation, otherwise,
    # just ignore it
    try:
        from sphinx.setup_command import BuildDoc

        class LocalBuildDoc(BuildDoc):

            builders = ['html', 'man']

            def generate_autoindex(self):
                print "**Autodocumenting from %s" % os.path.abspath(os.curdir)
                modules = {}
                option_dict = self.distribution.get_option_dict('build_sphinx')
                source_dir = os.path.join(option_dict['source_dir'][1], 'api')
                if not os.path.exists(source_dir):
                    os.makedirs(source_dir)
                for pkg in self.distribution.packages:
                    if '.' not in pkg:
                        os.path.walk(pkg, _find_modules, modules)
                module_list = modules.keys()
                module_list.sort()
                autoindex_filename = os.path.join(source_dir, 'autoindex.rst')
                with open(autoindex_filename, 'w') as autoindex:
                    autoindex.write(""".. toctree::
   :maxdepth: 1

""")
                    for module in module_list:
                        output_filename = os.path.join(source_dir,
                                                       "%s.rst" % module)
                        heading = "The :mod:`%s` Module" % module
                        underline = "=" * len(heading)
                        values = dict(module=module, heading=heading,
                                      underline=underline)

                        print "Generating %s" % output_filename
                        with open(output_filename, 'w') as output_file:
                            output_file.write(_rst_template % values)
                        autoindex.write("   %s.rst\n" % module)

            def run(self):
                if not os.getenv('SPHINX_DEBUG'):
                    self.generate_autoindex()

                for builder in self.builders:
                    self.builder = builder
                    self.finalize_options()
                    self.project = self.distribution.get_name()
                    self.version = self.distribution.get_version()
                    self.release = self.distribution.get_version()
                    BuildDoc.run(self)

        class LocalBuildLatex(LocalBuildDoc):
            builders = ['latex']

        cmdclass['build_sphinx'] = LocalBuildDoc
        cmdclass['build_sphinx_latex'] = LocalBuildLatex
    except ImportError:
        pass

    return cmdclass


def _get_revno(git_dir):
    """Return the number of commits since the most recent tag.

    We use git-describe to find this out, but if there are no
    tags then we fall back to counting commits since the beginning
    of time.
    """
    describe = _run_shell_command(
        "git --git-dir=%s describe --always" % git_dir)
    if "-" in describe:
        return describe.rsplit("-", 2)[-2]

    # no tags found
    revlist = _run_shell_command(
        "git --git-dir=%s rev-list --abbrev-commit HEAD" % git_dir)
    return len(revlist.splitlines())


def _get_version_from_git(pre_version):
    """Return a version which is equal to the tag that's on the current
    revision if there is one, or tag plus number of additional revisions
    if the current revision has no tag."""

    git_dir = _get_git_directory()
    if git_dir:
        if pre_version:
            try:
                return _run_shell_command(
                    "git --git-dir=" + git_dir + " describe --exact-match",
                    throw_on_error=True).replace('-', '.')
            except Exception:
                sha = _run_shell_command(
                    "git --git-dir=" + git_dir + " log -n1 --pretty=format:%h")
                return "%s.a%s.g%s" % (pre_version, _get_revno(git_dir), sha)
        else:
            return _run_shell_command(
                "git --git-dir=" + git_dir + " describe --always").replace(
                    '-', '.')
    return None


def _get_version_from_pkg_info(package_name):
    """Get the version from PKG-INFO file if we can."""
    try:
        pkg_info_file = open('PKG-INFO', 'r')
    except (IOError, OSError):
        return None
    try:
        pkg_info = email.message_from_file(pkg_info_file)
    except email.MessageError:
        return None
    # Check to make sure we're in our own dir
    if pkg_info.get('Name', None) != package_name:
        return None
    return pkg_info.get('Version', None)


def get_version(package_name, pre_version=None):
    """Get the version of the project. First, try getting it from PKG-INFO, if
|
||||
it exists. If it does, that means we're in a distribution tarball or that
|
||||
install has happened. Otherwise, if there is no PKG-INFO file, pull the
|
||||
version from git.
|
||||
|
||||
We do not support setup.py version sanity in git archive tarballs, nor do
|
||||
we support packagers directly sucking our git repo into theirs. We expect
|
||||
that a source tarball be made from our git repo - or that if someone wants
|
||||
to make a source tarball from a fork of our repo with additional tags in it
|
||||
that they understand and desire the results of doing that.
|
||||
"""
|
||||
version = os.environ.get("OSLO_PACKAGE_VERSION", None)
|
||||
if version:
|
||||
return version
|
||||
version = _get_version_from_pkg_info(package_name)
|
||||
if version:
|
||||
return version
|
||||
version = _get_version_from_git(pre_version)
|
||||
if version:
|
||||
return version
|
||||
raise Exception("Versioning for this project requires either an sdist"
|
||||
" tarball, or access to an upstream git repository.")
|
@ -1,82 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import ConfigParser
import logging
import os


CONFIG = None
curdir = os.path.abspath(os.path.dirname(__file__))
INIFILE = os.path.abspath(os.path.join(curdir, '..', 'etc', "config.ini"))
SAMPLE_INIFILE = os.path.abspath(os.path.join(curdir, '..',
                                              'etc', "config.ini-sample"))


class ConfigurationError(Exception):
    pass


def set_logging(level):
    logger = logging.getLogger()
    logger.setLevel({
        'debug': logging.DEBUG,
        'info': logging.INFO,
        'warning': logging.WARNING,
        'error': logging.ERROR,
        'critical': logging.CRITICAL}.get(
            level.lower()
        ))
    loghandler = logging.StreamHandler()
    logger.addHandler(loghandler)
    logger = logging.LoggerAdapter(logger, 'swfiller')
    logformat = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
    loghandler.setFormatter(logformat)


def parse_ini(inicfg=None):
    if hasattr(inicfg, 'read'):
        fp = inicfg
    elif inicfg and os.path.exists(inicfg):
        fp = open(inicfg)
    elif inicfg is None and os.path.exists(INIFILE):
        fp = open(INIFILE)
    else:
        raise ConfigurationError("Cannot find inicfg")

    config = ConfigParser.RawConfigParser()
    config.readfp(fp)
    return config


def get_config(section, option, default=None, _config=None):
    """Get section/option from ConfigParser or return default if specified."""
    global CONFIG
    if _config:
        CONFIG = _config
    elif not CONFIG:
        CONFIG = parse_ini()

    if not CONFIG.has_section(section):
        raise ConfigurationError("Invalid configuration, missing section: %s" %
                                 section)
    if CONFIG.has_option(section, option):
        return CONFIG.get(section, option)
    elif default is not None:
        return default
    else:
        raise ConfigurationError("Invalid configuration, missing "
                                 "section/option: %s/%s" % (section, option))
@ -1,97 +0,0 @@
# -*- encoding: utf-8 -*-

# Copyright 2013 eNovance.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Author : "Fabien Boucher <fabien.boucher@enovance.com>"
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# Last-Modified middleware must be installed in the proxy-server
# pipeline.

import swiftclient
import unittest

CONF = {
    'user': ('demo:demo', 'wxcvbn'),
    'auth_url': 'http://192.168.56.101:5000/v2.0',
}


class TestLastModifiedMiddleware(unittest.TestCase):

    def setUp(self):
        self.user = swiftclient.client.Connection(
            CONF['auth_url'],
            CONF['user'][0],
            CONF['user'][1],
            auth_version='2.0',
        )
        self.container_name = 'container'
        self.meta = 'x-container-meta-last-modified'

        self.user.put_container(self.container_name)

    def _verify_meta(self, exist=True):
        cont_d = self.user.get_container(self.container_name)
        if exist:
            self.assertTrue(self.meta in cont_d[0].keys())
            epoch = cont_d[0][self.meta]
            self.assertTrue(float(epoch) > 1)
        else:
            self.assertFalse(self.meta in cont_d[0].keys())

    def _get_meta(self):
        cont_d = self.user.get_container(self.container_name)
        return float(cont_d[0][self.meta])

    def test_POST_container(self):
        self.user.post_container(self.container_name, {'key': 'val'})
        self._verify_meta()

    def test_multiple_POST_container(self):
        self.user.post_container(self.container_name, {'key': 'val'})
        epoch1 = self._get_meta()
        self.user.post_container(self.container_name, {'key': 'val'})
        epoch2 = self._get_meta()
        self.assertNotEqual(epoch1, epoch2)

    def test_GET_container(self):
        self.user.get_container(self.container_name)
        self._verify_meta(exist=False)

    def test_PUT_object(self):
        self.user.put_object(self.container_name, 'obj_name', 'content')
        self._verify_meta()

    def test_multiple_PUT_object(self):
        self.user.put_object(self.container_name, 'obj_name', 'content')
        epoch1 = self._get_meta()
        self.user.put_object(self.container_name, 'obj_name2', 'content')
        epoch2 = self._get_meta()
        self.assertNotEqual(epoch1, epoch2)

    def test_DELETE_object(self):
        self.user.put_object(self.container_name, 'obj_name', 'content')
        epoch1 = self._get_meta()
        self.user.delete_object(self.container_name, 'obj_name')
        epoch2 = self._get_meta()
        self.assertNotEqual(epoch1, epoch2)

    def tearDown(self):
        # Verify and delete container content
        cont_d = self.user.get_container(self.container_name)
        for obj in cont_d[1]:
            self.user.delete_object(self.container_name, obj['name'])
        self.user.delete_container(self.container_name)
@ -1,674 +0,0 @@
# -*- encoding: utf-8 -*-

# Copyright 2013 eNovance.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Author : "Fabien Boucher <fabien.boucher@enovance.com>"
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
#
# Last-Modified middleware must be installed in the proxy-server
# pipeline.

# To start this functional test the admin users (on both keystones) used
# to synchronize the destination swift must own the ResellerAdmin role in
# keystone.

import eventlet
import unittest

from keystoneclient.v2_0 import client as ksclient
from swiftclient import client as sclient
from swsync import accounts
from swsync import filler
from swsync.utils import get_config


class TestSyncer(unittest.TestCase):

    def setUp(self):
        self.o_st = get_config('auth', 'keystone_origin')
        self.d_st = get_config('auth', 'keystone_dest')
        self.default_user_password = get_config('filler',
                                                'default_user_password')
        # Retrieve configuration for filler
        self.o_admin_tenant, self.o_admin_user, self.o_admin_password = (
            get_config('auth', 'keystone_origin_admin_credentials').split(':'))
        self.sw_c_concu = int(get_config('concurrency',
                                         'filler_swift_client_concurrency'))
        self.ks_c_concu = int(get_config('concurrency',
                                         'filler_keystone_client_concurrency'))
        self.pile = eventlet.GreenPile(self.sw_c_concu)
        self.pool = eventlet.GreenPool(self.ks_c_concu)
        # Set a keystone connection to origin server
        self.o_ks_client = ksclient.Client(
            auth_url=self.o_st,
            username=self.o_admin_user,
            password=self.o_admin_password,
            tenant_name=self.o_admin_tenant)
        # Set a keystone connection to destination server
        self.d_ks_client = ksclient.Client(
            auth_url=self.d_st,
            username=self.o_admin_user,
            password=self.o_admin_password,
            tenant_name=self.o_admin_tenant)
        # Retrieve admin (ResellerAdmin) token
        (self.o_admin_auth_url, self.o_admin_token) = \
            sclient.Connection(self.o_st,
                               "%s:%s" % (self.o_admin_tenant,
                                          self.o_admin_user),
                               self.o_admin_password,
                               auth_version=2).get_auth()
        # Retrieve admin (ResellerAdmin) token
        (self.d_admin_auth_url, self.d_admin_token) = \
            sclient.Connection(self.d_st,
                               "%s:%s" % (self.o_admin_tenant,
                                          self.o_admin_user),
                               self.o_admin_password,
                               auth_version=2).get_auth()
        # Instantiate syncer
        self.swsync = accounts.Accounts()

    def extract_created_a_u_iter(self, created):
        for ad, usd in created.items():
            account = ad[0]
            account_id = ad[1]
            # Retrieve the first user as we only need one
            username = usd[0][0]
            yield account, account_id, username

    def create_st_account_url(self, account_id):
        o_account_url = \
            self.o_admin_auth_url.split('AUTH_')[0] + 'AUTH_' + account_id
        d_account_url = \
            self.d_admin_auth_url.split('AUTH_')[0] + 'AUTH_' + account_id
        return o_account_url, d_account_url

    def verify_aco_diff(self, alo, ald):
        # Verify account, container, object diff in HEAD struct
        for k, v in alo[0].items():
            if k not in ('x-timestamp', 'x-trans-id',
                         'date', 'last-modified'):
                self.assertEqual(ald[0][k], v, msg='%s differs' % k)

    def delete_account_cont(self, account_url, token):
        cnx = sclient.http_connection(account_url)
        al = sclient.get_account(None, token,
                                 http_conn=cnx,
                                 full_listing=True)
        for container in [c['name'] for c in al[1]]:
            ci = sclient.get_container(None, token,
                                       container, http_conn=cnx,
                                       full_listing=True)
            on = [od['name'] for od in ci[1]]
            for obj in on:
                sclient.delete_object('', token, container,
                                      obj, http_conn=cnx)
            sclient.delete_container('', token, container, http_conn=cnx)

    def get_url(self, account_id, s_type):
        # Create account storage url
        o_account_url, d_account_url = self.create_st_account_url(account_id)
        if s_type == 'orig':
            url = o_account_url
        elif s_type == 'dest':
            url = d_account_url
        else:
            raise Exception('Unknown type')
        return url

    def get_cnx(self, account_id, s_type):
        url = self.get_url(account_id, s_type)
        return sclient.http_connection(url)

    def get_account_detail(self, account_id, token, s_type):
        cnx = self.get_cnx(account_id, s_type)
        return sclient.get_account(None, token,
                                   http_conn=cnx,
                                   full_listing=True)

    def list_containers(self, account_id, token, s_type):
        cd = self.get_account_detail(account_id, token, s_type)
        return cd[1]

    def get_container_detail(self, account_id, token, s_type, container):
        cnx = self.get_cnx(account_id, s_type)
        return sclient.get_container(None, token, container,
                                     http_conn=cnx, full_listing=True)

    def list_objects(self, account_id, token, s_type, container):
        cd = self.get_container_detail(account_id, token, s_type, container)
        return cd[1]

    def list_objects_in_containers(self, account_id, token, s_type):
        ret = {}
        cl = self.list_containers(account_id, token, s_type)
        for c in [c['name'] for c in cl]:
            objs = self.list_objects(account_id, token, s_type, c)
            ret[c] = objs
        return ret

    def get_object_detail(self, account_id, token, s_type, container, obj):
        cnx = self.get_cnx(account_id, s_type)
        return sclient.get_object("", token, container, obj, http_conn=cnx)

    def get_account_meta(self, account_id, token, s_type):
        d = self.get_account_detail(account_id, token, s_type)
        return {k: v for k, v in d[0].iteritems()
                if k.startswith('x-account-meta')}

    def get_container_meta(self, account_id, token, s_type, container):
        d = self.get_container_detail(account_id, token, s_type, container)
        return {k: v for k, v in d[0].iteritems()
                if k.startswith('x-container-meta')}

    def post_account(self, account_id, token, s_type, headers):
        cnx = self.get_cnx(account_id, s_type)
        sclient.post_account("", token, headers, http_conn=cnx)

    def post_container(self, account_id, token, s_type, container, headers):
        cnx = self.get_cnx(account_id, s_type)
        sclient.post_container("", token, container, headers, http_conn=cnx)

    def put_container(self, account_id, token, s_type, container):
        cnx = self.get_cnx(account_id, s_type)
        sclient.put_container("", token, container, http_conn=cnx)

    def delete_container(self, account_id, token, s_type, container):
        cnx = self.get_cnx(account_id, s_type)
        sclient.delete_container("", token, container, http_conn=cnx)

    def post_object(self, account_id, token, s_type, container, name, headers):
        cnx = self.get_cnx(account_id, s_type)
        sclient.post_object("", token, container, name, headers, http_conn=cnx)

    def put_object(self, account_id, token, s_type, container, name, content):
        cnx = self.get_cnx(account_id, s_type)
        sclient.put_object("", token, container, name, content, http_conn=cnx)

    def delete_object(self, account_id, token, s_type,
                      container, name):
        cnx = self.get_cnx(account_id, s_type)
        sclient.delete_object("", token, container, name,
                              http_conn=cnx)

    def test_01_sync_one_empty_account(self):
        """One empty account with metadata."""
        index = {}
        # Create account
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   1, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Post metadata on account
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            filler.create_account_meta(tenant_cnx)

        # Start sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            alo = self.get_account_detail(account_id,
                                          self.o_admin_token, 'orig')
            ald = self.get_account_detail(account_id,
                                          self.d_admin_token, 'dest')
            self.verify_aco_diff(alo, ald)

    def test_02_sync_many_empty_account(self):
        """Many empty accounts with metadata."""
        index = {}
        # Create accounts
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   3, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Post metadata on account
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            filler.create_account_meta(tenant_cnx)

        # Start sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            alo = self.get_account_detail(account_id,
                                          self.o_admin_token, 'orig')
            ald = self.get_account_detail(account_id,
                                          self.d_admin_token, 'dest')
            self.verify_aco_diff(alo, ald)

    def test_03_sync_many_accounts_with_many_containers_meta(self):
        """Many accounts with many containers and container metadata."""
        index = {}
        index_container = {}
        # Create accounts
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   3, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            acc = (account, account_id)
            filler.create_containers(tenant_cnx, acc, 3, index_container)

        # Start sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Verify container listing
            clo = self.list_containers(account_id,
                                       self.o_admin_token, 'orig')
            cld = self.list_containers(account_id,
                                       self.d_admin_token, 'dest')
            self.assertEqual(len(clo), len(cld))
            for do in clo:
                match = [dd for dd in cld if dd['name'] == do['name']]
                self.assertEqual(len(match), 1)
                self.assertDictEqual(do, match[0])
            # Verify container details
            clo_c_names = [d['name'] for d in clo]
            for c_name in clo_c_names:
                cdo = self.get_container_detail(account_id, self.o_admin_token,
                                                'orig', c_name)
                cdd = self.get_container_detail(account_id, self.d_admin_token,
                                                'dest', c_name)
                self.verify_aco_diff(cdo, cdd)

    def test_04_sync_many_accounts_many_containers_and_obj_meta(self):
        """Many accounts with many containers and some objects."""
        index = {}
        index_container = {}
        # Create account
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   1, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            acc = (account, account_id)
            filler.create_containers(tenant_cnx, acc, 1, index_container)
            filler.create_objects(tenant_cnx, acc, 1, 2048, index_container)

        # Start sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Verify container listing
            olo = self.list_objects_in_containers(account_id,
                                                  self.o_admin_token, 'orig')
            old = self.list_objects_in_containers(account_id,
                                                  self.d_admin_token, 'dest')

            # Verify we have the same number of containers
            self.assertListEqual(olo.keys(), old.keys())
            # For each container
            for c, objs in olo.items():
                for obj in objs:
                    # Verify first object detail returned by container
                    # server
                    match = [od for od in old[c] if od['name'] == obj['name']]
                    self.assertEqual(len(match), 1)
                    obj_d = match[0]
                    self.assertDictEqual(obj, obj_d)
                # Verify object details from object server
                obj_names = [d['name'] for d in olo[c]]
                for obj_name in obj_names:
                    objd_o = self.get_object_detail(account_id,
                                                    self.o_admin_token, 'orig',
                                                    c, obj_name)
                    objd_d = self.get_object_detail(account_id,
                                                    self.d_admin_token, 'dest',
                                                    c, obj_name)
                    self.verify_aco_diff(objd_o, objd_d)
                    # Verify content
                    self.assertEqual(objd_o[1], objd_d[1])

    def test_05_account_two_passes(self):
        """Account modified between two sync passes."""
        index = {}
        # Create accounts
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   3, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Post metadata on account
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            filler.create_account_meta(tenant_cnx)

        # Start sync process
        self.swsync.process()

        # Add more meta to account
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Modify metadata on account
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            token = tenant_cnx.get_auth()[1]
            # Remove one, modify one, and add one meta
            a_meta = self.get_account_meta(account_id,
                                           token,
                                           'orig')
            a_meta_k_names = [k.split('-')[-1] for k in a_meta]
            headers = {}
            headers['X-Account-Meta-a1'] = 'b1'
            headers["X-Remove-Account-Meta-%s" % a_meta_k_names[0]] = 'x'
            headers["X-Account-Meta-%s" % a_meta_k_names[1]] = 'b2'
            self.post_account(account_id, token,
                              'orig', headers=headers)

        # Restart sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            alo = self.get_account_detail(account_id,
                                          self.o_admin_token, 'orig')
            ald = self.get_account_detail(account_id,
                                          self.d_admin_token, 'dest')
            self.verify_aco_diff(alo, ald)

    def test_06_container_two_passes(self):
        """Containers modified between two sync passes."""
        index = {}
        index_container = {}
        # Create accounts
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   3, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            acc = (account, account_id)
            filler.create_containers(tenant_cnx, acc, 3, index_container)

        # Start sync process
        self.swsync.process()

        # Modify container in account
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)

            token = tenant_cnx.get_auth()[1]
            # Modify an existing container meta
            clo = self.list_containers(account_id,
                                       token, 'orig')
            co_name = clo[0]['name']
            c_meta = self.get_container_meta(account_id,
                                             token, 'orig',
                                             co_name)
            c_meta_k_names = [k.split('-')[-1] for k in c_meta]
            headers = {}
            headers['X-Container-Meta-a1'] = 'b1'
            headers["X-Remove-Container-Meta-%s" % c_meta_k_names[0]] = 'x'
            headers["X-Container-Meta-%s" % c_meta_k_names[1]] = 'b2'
            self.post_container(account_id, token,
                                'orig',
                                headers=headers,
                                container=co_name)
            # Add some more containers
            self.put_container(account_id, token, 'orig', 'foobar')
            self.put_container(account_id, token, 'orig', 'foobar1')
            self.put_container(account_id, token, 'orig', 'foobar2')
            # Delete one container
            co_name = clo[1]['name']
            self.delete_container(account_id, token, 'orig', co_name)

        # Restart sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Verify container listing
            clo = self.list_containers(account_id,
                                       self.o_admin_token, 'orig')
            cld = self.list_containers(account_id,
                                       self.d_admin_token, 'dest')
            self.assertEqual(len(clo), len(cld))
            for do in clo:
                match = [dd for dd in cld if dd['name'] == do['name']]
                self.assertEqual(len(match), 1)
                self.assertDictEqual(do, match[0])
            # Verify container details
            clo_c_names = [d['name'] for d in clo]
            for c_name in clo_c_names:
                cdo = self.get_container_detail(account_id, self.o_admin_token,
                                                'orig', c_name)
                cdd = self.get_container_detail(account_id, self.d_admin_token,
                                                'dest', c_name)
                self.verify_aco_diff(cdo, cdd)

    def test_07_object_two_passes(self):
        """Objects modified between two sync passes."""
        index = {}
        index_container = {}
        # Create account
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   1, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            acc = (account, account_id)
            filler.create_containers(tenant_cnx, acc, 1, index_container)
            filler.create_objects(tenant_cnx, acc, 3, 2048, index_container)

        # Start sync process
        self.swsync.process()

        # Modify objects in containers
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)

            token = tenant_cnx.get_auth()[1]
            c_o = self.list_objects_in_containers(account_id, token, 'orig')
            for cont, objs in c_o.iteritems():
                for obj in objs:
                    # Modify object meta
                    obj_d, data = self.get_object_detail(account_id,
                                                         token, 'orig',
                                                         cont, obj['name'])
                    meta = {k: v for k, v in obj_d.iteritems()
                            if k.startswith('x-object-meta')}
                    meta_k_names = [k.split('-')[-1] for k in meta]
                    headers = {}
                    headers['X-Object-Meta-a1'] = 'b1'
                    headers["X-Remove-Object-Meta-%s" % meta_k_names[0]] = 'x'
                    headers["X-Object-Meta-%s" % meta_k_names[1]] = 'b2'
                    self.post_object(account_id, token,
                                     'orig',
                                     headers=headers,
                                     container=cont,
                                     name=obj['name'])
                # Create some objects
                self.put_object(account_id, token, 'orig',
                                cont, 'foofoo', 'barbarbar')
                self.put_object(account_id, token, 'orig',
                                cont, 'foofoo1', 'barbarbar')
                self.put_object(account_id, token, 'orig',
                                cont, 'foofoo2', 'barbarbar')

                o_names = [o['name'] for o in objs]
                # Delete an object
                name = o_names[0]
                self.delete_object(account_id, token, 'orig',
                                   cont, name)

        # Restart sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            # Verify container listing
            olo = self.list_objects_in_containers(account_id,
                                                  self.o_admin_token, 'orig')
            old = self.list_objects_in_containers(account_id,
                                                  self.d_admin_token, 'dest')

            # Verify we have the same number of containers
            self.assertListEqual(olo.keys(), old.keys())
            # For each container
            for c, objs in olo.items():
                for obj in objs:
                    # Verify first object detail returned by container server
                    match = [od for od in old[c] if od['name'] == obj['name']]
                    self.assertEqual(len(match), 1)
                    obj_d = match[0]
                    a = obj.copy()
                    b = obj_d.copy()
                    del a['last_modified']
                    del b['last_modified']
                    self.assertDictEqual(a, b)
                # Verify object details from object server
                obj_names = [d['name'] for d in olo[c]]
                for obj_name in obj_names:
                    objd_o = self.get_object_detail(account_id,
                                                    self.o_admin_token, 'orig',
                                                    c, obj_name)
                    objd_d = self.get_object_detail(account_id,
                                                    self.d_admin_token, 'dest',
                                                    c, obj_name)
                    self.verify_aco_diff(objd_o, objd_d)
                    # Verify content
                    self.assertEqual(objd_o[1], objd_d[1])

    def test_08_sync_containers_with_last_modified(self):
        """Containers with the last-modified middleware."""
        index = {}
        index_container = {}
        # Create account
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   1, 1, index)

        # Create container and store new account && container
        account_dest, container_dest = None, None
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            acc = (account, account_id)
            filler.create_containers(tenant_cnx, acc, 1, index_container)
            filler.create_objects(tenant_cnx, acc, 1, 2048, index_container)
            cld = self.list_containers(account_id,
                                       self.d_admin_token, 'orig')
            account_dest = account_id
            container_dest = cld[0]['name']
            break

        # Start sync process
        self.swsync.process()

        # Update dest
        self.put_object(account_dest, self.d_admin_token, 'dest',
                        container_dest, 'lm-test', 'lm-data')

        # Get timestamp
        cdd = self.get_container_detail(account_dest, self.d_admin_token,
                                        'dest', container_dest)
        try:
            dest_lm = cdd[0]['x-container-meta-last-modified']
        except KeyError:
            # Last-modified middleware is not present
            return

        # Restart sync process
        self.swsync.process()

        # Check that the dest timestamp has not been updated
        cdd = self.get_container_detail(account_dest, self.d_admin_token,
                                        'dest', container_dest)
        self.assertEqual(dest_lm, cdd[0]['x-container-meta-last-modified'])

        # Check that the new object is still present in dest
        obj_detail = self.get_object_detail(account_dest, self.d_admin_token,
|
||||
'dest', container_dest, 'lm-test')
|
||||
self.assertEqual('lm-data', obj_detail[1])
|
||||
|
||||
def tearDown(self):
|
||||
if self.created:
|
||||
for k, v in self.created.items():
|
||||
user_info_list = [user[1] for user in v]
|
||||
account_id = k[1]
|
||||
o_account_url, d_account_url = \
|
||||
self.create_st_account_url(account_id)
|
||||
# Remove account content on origin and destination
|
||||
self.delete_account_cont(o_account_url, self.o_admin_token)
|
||||
self.delete_account_cont(d_account_url, self.d_admin_token)
|
||||
# We just need to delete keystone accounts and users
|
||||
# in origin keystone as syncer does not sync
|
||||
# keystone database
|
||||
filler.delete_account(self.o_ks_client,
|
||||
user_info_list,
|
||||
k)
|
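The metadata POST at the top of this hunk relies on Swift's header conventions: `X-Object-Meta-<name>` sets a piece of object metadata, while `X-Remove-Object-Meta-<name>` (any non-empty value, conventionally `'x'`) asks for its removal. Building such a header dict can be sketched as follows — the helper and key names are illustrative, not from the source:

```python
def build_meta_update(remove_keys, set_items):
    """Build a Swift POST header dict that removes some object
    metadata keys and sets others (illustrative helper)."""
    headers = {}
    for k in remove_keys:
        # Any non-empty value triggers removal; 'x' is conventional.
        headers["X-Remove-Object-Meta-%s" % k] = 'x'
    for k, v in set_items.items():
        headers["X-Object-Meta-%s" % k] = v
    return headers

headers = build_meta_update(['color'], {'flavor': 'b2'})
```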
@ -1,262 +0,0 @@
# -*- encoding: utf-8 -*-

# Copyright 2013 eNovance.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Author : "Joe Hakim Rahme <joe.hakim.rahme@enovance.com>"
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

# To start this functional test, the admin users (on both keystones) used
# to synchronize the destination swift must own the ResellerAdmin role in
# keystone.
#
# In your config.ini file, you should uncomment the field tenant_filter_file
# and specify a path to a file where you're allowed to read and write.

import eventlet
import random
import unittest

from keystoneclient.v2_0 import client as ksclient
from swiftclient import client as sclient

from swsync import accounts
from swsync import filler
from swsync.utils import get_config


class TestSyncer(unittest.TestCase):

    def setUp(self):
        self.o_st = get_config('auth', 'keystone_origin')
        self.d_st = get_config('auth', 'keystone_dest')
        self.default_user_password = get_config('filler',
                                                'default_user_password')
        # Retrieve configuration for the filler
        self.o_admin_tenant, self.o_admin_user, self.o_admin_password = (
            get_config('auth', 'keystone_origin_admin_credentials').split(':'))
        self.sw_c_concu = int(get_config('concurrency',
                                         'filler_swift_client_concurrency'))
        self.ks_c_concu = int(get_config('concurrency',
                                         'filler_keystone_client_concurrency'))
        self.filter_filename = get_config('sync', 'tenant_filter_file')
        self.pile = eventlet.GreenPile(self.sw_c_concu)
        self.pool = eventlet.GreenPool(self.ks_c_concu)
        # Set up a keystone connection to the origin server
        self.o_ks_client = ksclient.Client(
            auth_url=self.o_st,
            username=self.o_admin_user,
            password=self.o_admin_password,
            tenant_name=self.o_admin_tenant)
        # Set up a keystone connection to the destination server
        self.d_ks_client = ksclient.Client(
            auth_url=self.d_st,
            username=self.o_admin_user,
            password=self.o_admin_password,
            tenant_name=self.o_admin_tenant)
        # Retrieve the origin admin (ResellerAdmin) token
        (self.o_admin_auth_url, self.o_admin_token) = \
            sclient.Connection(self.o_st,
                               "%s:%s" % (self.o_admin_tenant,
                                          self.o_admin_user),
                               self.o_admin_password,
                               auth_version=2).get_auth()
        # Retrieve the destination admin (ResellerAdmin) token
        (self.d_admin_auth_url, self.d_admin_token) = \
            sclient.Connection(self.d_st,
                               "%s:%s" % (self.o_admin_tenant,
                                          self.o_admin_user),
                               self.o_admin_password,
                               auth_version=2).get_auth()
        # Instantiate the syncer
        self.swsync = accounts.Accounts()

    def extract_created_a_u_iter(self, created):
        for ad, usd in created.items():
            account = ad[0]
            account_id = ad[1]
            # Retrieve the first user as we only need one
            username = usd[0][0]
            yield account, account_id, username

    def create_st_account_url(self, account_id):
        o_account_url = \
            self.o_admin_auth_url.split('AUTH_')[0] + 'AUTH_' + account_id
        d_account_url = \
            self.d_admin_auth_url.split('AUTH_')[0] + 'AUTH_' + account_id
        return o_account_url, d_account_url

    def verify_aco_diff(self, alo, ald):
        """Verify that two accounts are similar, to validate migration."""
        for k, v in alo[0].items():
            if k not in ('x-timestamp', 'x-trans-id',
                         'date', 'last-modified'):
                self.assertEqual(ald[0][k], v, msg='%s differs' % k)

    def delete_account_cont(self, account_url, token):
        cnx = sclient.http_connection(account_url)
        al = sclient.get_account(None, token,
                                 http_conn=cnx,
                                 full_listing=True)
        for container in [c['name'] for c in al[1]]:
            ci = sclient.get_container(None, token,
                                       container, http_conn=cnx,
                                       full_listing=True)
            on = [od['name'] for od in ci[1]]
            for obj in on:
                sclient.delete_object('', token, container,
                                      obj, http_conn=cnx)
            sclient.delete_container('', token, container, http_conn=cnx)

    def get_url(self, account_id, s_type):
        # Create account storage url
        o_account_url, d_account_url = self.create_st_account_url(account_id)
        if s_type == 'orig':
            url = o_account_url
        elif s_type == 'dest':
            url = d_account_url
        else:
            raise Exception('Unknown type')
        return url

    def get_cnx(self, account_id, s_type):
        url = self.get_url(account_id, s_type)
        return sclient.http_connection(url)

    def get_account_detail(self, account_id, token, s_type):
        cnx = self.get_cnx(account_id, s_type)
        return sclient.get_account(None, token,
                                   http_conn=cnx,
                                   full_listing=True)

    def list_containers(self, account_id, token, s_type):
        cd = self.get_account_detail(account_id, token, s_type)
        return cd[1]

    def get_container_detail(self, account_id, token, s_type, container):
        cnx = self.get_cnx(account_id, s_type)
        return sclient.get_container(None, token, container,
                                     http_conn=cnx, full_listing=True)

    def list_objects(self, account_id, token, s_type, container):
        cd = self.get_container_detail(account_id, token, s_type, container)
        return cd[1]

    def list_objects_in_containers(self, account_id, token, s_type):
        ret = {}
        cl = self.list_containers(account_id, token, s_type)
        for c in [c['name'] for c in cl]:
            objs = self.list_objects(account_id, token, s_type, c)
            ret[c] = objs
        return ret

    def get_object_detail(self, account_id, token, s_type, container, obj):
        cnx = self.get_cnx(account_id, s_type)
        return sclient.get_object("", token, container, obj, http_conn=cnx)

    def get_account_meta(self, account_id, token, s_type):
        d = self.get_account_detail(account_id, token, s_type)
        return {k: v for k, v in d[0].iteritems()
                if k.startswith('x-account-meta')}

    def get_container_meta(self, account_id, token, s_type, container):
        d = self.get_container_detail(account_id, token, s_type, container)
        return {k: v for k, v in d[0].iteritems()
                if k.startswith('x-container-meta')}

    def post_account(self, account_id, token, s_type, headers):
        cnx = self.get_cnx(account_id, s_type)
        sclient.post_account("", token, headers, http_conn=cnx)

    def post_container(self, account_id, token, s_type, container, headers):
        cnx = self.get_cnx(account_id, s_type)
        sclient.post_container("", token, container, headers, http_conn=cnx)

    def put_container(self, account_id, token, s_type, container):
        cnx = self.get_cnx(account_id, s_type)
        sclient.put_container("", token, container, http_conn=cnx)

    def delete_container(self, account_id, token, s_type, container):
        cnx = self.get_cnx(account_id, s_type)
        sclient.delete_container("", token, container, http_conn=cnx)

    def post_object(self, account_id, token, s_type, container, name, headers):
        cnx = self.get_cnx(account_id, s_type)
        sclient.post_object("", token, container, name, headers, http_conn=cnx)

    def put_object(self, account_id, token, s_type, container, name, content):
        cnx = self.get_cnx(account_id, s_type)
        sclient.put_object("", token, container, name, content, http_conn=cnx)

    def delete_object(self, account_id, token, s_type,
                      container, name):
        cnx = self.get_cnx(account_id, s_type)
        sclient.delete_object("", token, container, name,
                              http_conn=cnx)

    def test_01_sync_one_of_two_empty_accounts(self):
        """Create two empty accounts, sync only one."""
        index = {}

        # Create accounts
        self.created = filler.create_swift_account(self.o_ks_client,
                                                   self.pile,
                                                   2, 1, index)

        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):

            # Post metadata on the account
            tenant_cnx = sclient.Connection(self.o_st,
                                            "%s:%s" % (account, username),
                                            self.default_user_password,
                                            auth_version=2)
            filler.create_account_meta(tenant_cnx)

        # Select a random account and write it to the filter file
        t_account, t_account_id, t_username = random.choice(list(
            self.extract_created_a_u_iter(self.created)))
        with open(self.filter_filename, "w") as filterlist:
            filterlist.write(t_account + "\n")

        # Start sync process
        self.swsync.process()

        # Now verify dest
        for account, account_id, username in \
                self.extract_created_a_u_iter(self.created):
            alo = self.get_account_detail(account_id,
                                          self.o_admin_token, 'orig')
            ald = self.get_account_detail(account_id,
                                          self.d_admin_token, 'dest')
            if account == t_account:
                self.verify_aco_diff(alo, ald)

    def tearDown(self):
        if self.created:
            for k, v in self.created.items():
                user_info_list = [user[1] for user in v]
                account_id = k[1]
                o_account_url, d_account_url = \
                    self.create_st_account_url(account_id)
                # Remove account content on origin and destination
                self.delete_account_cont(o_account_url, self.o_admin_token)
                self.delete_account_cont(d_account_url, self.d_admin_token)
                # We only need to delete keystone accounts and users
                # in the origin keystone, as the syncer does not sync
                # the keystone database
                filler.delete_account(self.o_ks_client,
                                      user_info_list,
                                      k)
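The `setUp` in the file above reads every setting through `get_config(section, option)` from the project's `config.ini`. A minimal sketch of the sections and keys this functional test expects follows — the keys are taken from the `get_config` calls above, but every value here is an illustrative placeholder, not taken from the repository:

```ini
[auth]
keystone_origin = http://origin-keystone:5000/v2.0
keystone_origin_admin_credentials = admin_tenant:admin_user:admin_password
keystone_dest = http://dest-keystone:5000/v2.0

[filler]
default_user_password = password

[concurrency]
filler_swift_client_concurrency = 10
filler_keystone_client_concurrency = 5

[sync]
tenant_filter_file = /tmp/tenant_filter
```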
@ -1,41 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""Test base classes imported from ceilometer."""

import unittest2

import mox
import stubout

from swsync import utils


class TestCase(unittest2.TestCase):

    def setUp(self):
        super(TestCase, self).setUp()
        self.mox = mox.Mox()
        self.stubs = stubout.StubOutForTesting()
        utils.CONFIG = utils.parse_ini(utils.SAMPLE_INIFILE)

    def tearDown(self):
        self.mox.UnsetStubs()
        self.stubs.UnsetAll()
        self.stubs.SmartUnsetAll()
        self.mox.VerifyAll()
        super(TestCase, self).tearDown()
@ -1,196 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import random
import urlparse
import uuid

from swsync.utils import ConfigurationError

STORAGE_ORIG = 'http://storage-orig.com'
STORAGE_DEST = 'http://storage-dest.com'

TENANTS_LIST = {'foo1': {'id': uuid.uuid4().hex},
                'foo2': {'id': uuid.uuid4().hex},
                'foo3': {'id': uuid.uuid4().hex}}


def gen_random_lastmodified():
    delta = datetime.timedelta(seconds=random.randint(1, 60))
    return str(datetime.datetime.now() + delta)


def gen_container(x):
    return {'name': x,
            'count': random.randint(1, 50),
            'bytes': random.randint(1, 5000)}


def gen_object(x):
    return {'bytes': random.randint(1, 5000),
            'last_modified': gen_random_lastmodified(),
            'name': x}


CONTAINERS_LIST = [
    (gen_container('cont1'),
     [gen_object('obj%s' % (x)) for x in xrange(random.randint(1, 10))]),
    (gen_container('cont2'),
     [gen_object('obj%s' % (x)) for x in xrange(random.randint(1, 10))]),
    (gen_container('cont3'),
     [gen_object('obj%s' % (x)) for x in xrange(random.randint(1, 10))]),
]

CONTAINER_HEADERS = {
    'x-foo': 'true', 'x-bar': 'bar',
    'x-container-object-count': '10',
    'x-container-bytes-used': '1000000',
    'x-trans-id': 'transid',
}

CONFIGDICT = {'auth':
              {'keystone_origin': STORAGE_ORIG,
               'keystone_origin_admin_credentials': 'foo1:bar:kernel',
               'keystone_dest': STORAGE_DEST}}


def fake_get_config(section, option):
    try:
        return CONFIGDICT[section][option]
    except KeyError:
        raise ConfigurationError


def fake_get_filter(self):
    return {'foo1', 'foo2', 'foo3'}


class FakeSWConnection(object):
    def __init__(self, *args, **kwargs):
        self.mainargs = args
        self.mainkwargs = kwargs

    def get_auth(self, *args, **kwargs):
        tenant, user = self.mainargs[1].split(':')
        tenant_id = TENANTS_LIST[tenant]['id']
        return ('%s/v1/AUTH_%s' % (STORAGE_DEST, tenant_id), 'token')

    def get_container(*args, **kargs):
        pass

    def delete_object(self, *args, **kargs):
        pass

    def put_container(self, *args, **kargs):
        pass

    def put_object(self, *args, **kargs):
        pass

    def get_account(self, *args, **kargs):
        pass

    def post_account(self, *args, **kargs):
        pass


class FakeSWObject(object):
    def __init__(self, object_name):
        pass


class FakeSWClient(object):
    @staticmethod
    def http_connection(url):
        return (urlparse.urlparse(url), None)


def fake_get_auth(auth_url, tenant, user, password):
    return FakeSWConnection(
        auth_url,
        '%s:%s' % (tenant, user),
        password,
        auth_version=2).get_auth()


class FakeKSTenant(object):
    def __init__(self, tenant_name):
        self.tenant_name = tenant_name
        self.name = tenant_name

    @property
    def id(self):
        return TENANTS_LIST[self.tenant_name]['id']

    def __str__(self):
        return self.tenant_name


class FakeKSUser(object):
    def __init__(self):
        self.id = uuid.uuid4().hex


class FakeKSClientUsers(object):
    def create(self, *args):
        return FakeKSUser()

    def delete(self, *args):
        pass


class FakeKSRole(object):
    def __init__(self):
        self.id = uuid.uuid4().hex
        self.name = 'Member'


class FakeKSClientRoles(object):
    def add_user_role(self, *args):
        pass

    def list(self):
        return [FakeKSRole(), ]


class FakeKSClientTenant(object):
    def list(self):
        for t in list(TENANTS_LIST):
            yield FakeKSTenant(t)

    def create(self, account):
        return FakeKSTenant(TENANTS_LIST.keys()[0])

    def delete(self, *args):
        pass


class FakeKSClient(object):
    def __init__(self, *args):
        self.args = args
        self.tenants = FakeKSClientTenant()
        self.roles = FakeKSClientRoles()
        self.users = FakeKSClientUsers()

    def __call__(self):
        return self.args


class FakeKS(object):
    @staticmethod
    def Client(*args, **kwargs):
        return FakeKSClient(args, kwargs)
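These fakes get wired in through `stubout`'s `stubs.Set`, which swaps a module or class attribute for the fake and restores the original on teardown. A minimal self-contained sketch of that swap-and-restore idea, using a toy namespace instead of the real `swiftclient` (all names here are illustrative):

```python
class FakeConnection(object):
    """Stand-in that records calls instead of talking to Swift."""
    def __init__(self, *args, **kwargs):
        self.calls = []

    def get_auth(self):
        self.calls.append('get_auth')
        return ('http://storage/v1/AUTH_abc', 'token')


class StubOut(object):
    """Tiny re-implementation of the stubout idea: swap an attribute,
    remember the original, restore everything later."""
    def __init__(self):
        self._saved = []

    def set(self, obj, attr_name, new_attr):
        self._saved.append((obj, attr_name, getattr(obj, attr_name)))
        setattr(obj, attr_name, new_attr)

    def unset_all(self):
        for obj, attr_name, old in reversed(self._saved):
            setattr(obj, attr_name, old)
        self._saved = []


class client_module(object):
    """Toy namespace standing in for swiftclient.client."""
    Connection = object  # pretend this is the real Connection class

stubs = StubOut()
stubs.set(client_module, 'Connection', FakeConnection)
cnx = client_module.Connection()          # actually builds a FakeConnection
url, token = cnx.get_auth()
stubs.unset_all()                         # original Connection is back
```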
@ -1,318 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging

import keystoneclient
import swiftclient

import swsync.accounts
import tests.units.base
import tests.units.fakes as fakes


class TestAccountBase(tests.units.base.TestCase):
    def setUp(self):
        super(TestAccountBase, self).setUp()
        self.accounts_cls = swsync.accounts.Accounts()
        self._stubs()

    def _stubs(self):
        self.stubs.Set(keystoneclient.v2_0, 'client', fakes.FakeKS)
        self.stubs.Set(swiftclient.client, 'Connection',
                       fakes.FakeSWConnection)
        self.stubs.Set(swsync.accounts, 'get_config', fakes.fake_get_config)
        self.stubs.Set(swsync.accounts.Accounts, 'get_target_tenant_filter',
                       fakes.fake_get_filter)
        self.stubs.Set(swiftclient, 'http_connection',
                       fakes.FakeSWClient.http_connection)


class TestAccountSyncMetadata(TestAccountBase):
    def _base_sync_metadata(self, orig_dict={},
                            dest_dict={},
                            get_account_called=[],
                            post_account_called=[],
                            info_called=[],
                            sync_container_called=[],
                            raise_post_account=False):

        def fake_info(msg, *args):
            info_called.append(msg)
        self.stubs.Set(logging, 'info', fake_info)

        def get_account(self, *args, **kwargs):
            # First call returns the origin listing, later calls
            # return the destination listing
            if len(get_account_called) == 0:
                get_account_called.append(args)
                return orig_dict
            else:
                get_account_called.append(args)
                return dest_dict
        self.stubs.Set(swiftclient, 'get_account', get_account)

        def post_account(url, token, headers, **kwargs):
            post_account_called.append(headers)

            if raise_post_account:
                raise swiftclient.client.ClientException("Error in testing")
        self.stubs.Set(swiftclient, 'post_account', post_account)

        class Containers(object):
            def sync(*args, **kwargs):
                sync_container_called.append(args)

            def delete_container(*args, **kwargs):
                pass

        self.accounts_cls.container_cls = Containers()
        self.accounts_cls.sync_account("http://orig", "otoken",
                                       "http://dest", "dtoken")

    def test_sync_metadata_delete_dest(self):
        get_account_called = []
        sync_container_called = []
        post_account_called = []
        info_called = []

        orig_dict = ({'x-account-meta-life': 'beautiful',
                      'x-account-container-count': 1},
                     [{'name': 'cont1'}])

        dest_dict = ({'x-account-meta-vita': 'bella',
                      'x-account-container-count': 1},
                     [{'name': 'cont1'}])
        self._base_sync_metadata(orig_dict,
                                 dest_dict,
                                 info_called=info_called,
                                 sync_container_called=sync_container_called,
                                 post_account_called=post_account_called,
                                 get_account_called=get_account_called)

        self.assertEqual(len(sync_container_called), 1)
        self.assertEqual(len(get_account_called), 2)
        self.assertTrue(info_called)

        self.assertIn('x-account-meta-life',
                      post_account_called[0])
        self.assertEqual(post_account_called[0]['x-account-meta-life'],
                         'beautiful')
        self.assertIn('x-account-meta-vita',
                      post_account_called[0])
        self.assertEqual(post_account_called[0]['x-account-meta-vita'],
                         '')

    def test_sync_metadata_update_dest(self):
        get_account_called = []
        sync_container_called = []
        post_account_called = []
        info_called = []

        orig_dict = ({'x-account-meta-life': 'beautiful',
                      'x-account-container-count': 1},
                     [{'name': 'cont1'}])

        dest_dict = ({'x-account-meta-life': 'bella',
                      'x-account-container-count': 1},
                     [{'name': 'cont1'}])
        self._base_sync_metadata(orig_dict,
                                 dest_dict,
                                 info_called=info_called,
                                 sync_container_called=sync_container_called,
                                 post_account_called=post_account_called,
                                 get_account_called=get_account_called)

        self.assertEqual(len(sync_container_called), 1)
        self.assertEqual(len(get_account_called), 2)
        self.assertTrue(info_called)

        self.assertIn('x-account-meta-life',
                      post_account_called[0])
        self.assertEqual(post_account_called[0]['x-account-meta-life'],
                         'beautiful')

    def test_sync_metadata_add_to_dest(self):
        info_called = []
        get_account_called = []
        sync_container_called = []
        post_account_called = []

        orig_dict = ({'x-account-meta-life': 'beautiful',
                      'x-account-container-count': 1},
                     [{'name': 'cont1'}])

        dest_dict = ({'x-account-container-count': 1},
                     [{'name': 'cont1'}])
        self._base_sync_metadata(orig_dict,
                                 dest_dict,
                                 info_called=info_called,
                                 sync_container_called=sync_container_called,
                                 post_account_called=post_account_called,
                                 get_account_called=get_account_called)

        self.assertEqual(len(sync_container_called), 1)
        self.assertEqual(len(get_account_called), 2)
        self.assertTrue(info_called)

        self.assertIn('x-account-meta-life',
                      post_account_called[0])
        self.assertEqual(post_account_called[0]['x-account-meta-life'],
                         'beautiful')

    def test_sync_metadata_raise(self):
        info_called = []
        get_account_called = []
        sync_container_called = []
        post_account_called = []

        orig_dict = ({'x-account-meta-life': 'beautiful',
                      'x-account-container-count': 1},
                     [{'name': 'cont1'}])

        dest_dict = ({'x-account-container-count': 1},
                     [{'name': 'cont1'}])
        self._base_sync_metadata(orig_dict,
                                 dest_dict,
                                 info_called=info_called,
                                 sync_container_called=sync_container_called,
                                 post_account_called=post_account_called,
                                 get_account_called=get_account_called,
                                 raise_post_account=True)
        self.assertTrue(info_called)
        self.assertIn('ERROR: updating container metadata: orig, ',
                      info_called)
        self.assertFalse(sync_container_called)


class TestAccountSync(TestAccountBase):
    def test_get_swift_auth(self):
        tenant_name = 'foo1'
        ret = self.accounts_cls.get_swift_auth(
            "http://test.com", tenant_name, "user", "password")
        tenant_id = fakes.TENANTS_LIST[tenant_name]['id']
        self.assertEqual(ret[0], "%s/v1/AUTH_%s" % (fakes.STORAGE_DEST,
                                                    tenant_id))

    def test_get_ks_auth_orig(self):
        _, kwargs = self.accounts_cls.get_ks_auth_orig()()
        k = fakes.CONFIGDICT['auth']['keystone_origin_admin_credentials']
        tenant_name, username, password = k.split(':')

        self.assertEqual(kwargs['tenant_name'], tenant_name)
        self.assertEqual(kwargs['username'], username)
        self.assertEqual(kwargs['password'], password)
        k = fakes.CONFIGDICT['auth']['keystone_origin']
        self.assertEqual(k, kwargs['auth_url'])

    def test_process(self):
        ret = []

        def sync_account(orig_storage_url,
                         orig_token,
                         dest_storage_url,
                         dest_token):
            ret.append((orig_storage_url, dest_storage_url))
        self.accounts_cls.sync_account = sync_account
        self.accounts_cls.process()
        tenant_list_ids = sorted(fakes.TENANTS_LIST[x]['id']
                                 for x in fakes.TENANTS_LIST)
        ret_orig_storage_id = sorted(
            x[0][x[0].find('AUTH_') + 5:] for x in ret)
        self.assertEqual(tenant_list_ids, ret_orig_storage_id)
        [self.assertTrue(y[1].startswith(fakes.STORAGE_DEST)) for y in ret]

    def test_sync_account(self):
        ret = []

        def get_account(*args, **kwargs):
            return ({'x-account-container-count': len(fakes.CONTAINERS_LIST)},
                    [x[0] for x in fakes.CONTAINERS_LIST])
        self.stubs.Set(swiftclient, 'get_account', get_account)

        class Containers(object):
            def sync(*args, **kwargs):
                ret.append(args)

            def delete_container(*args, **kwargs):
                pass
        self.accounts_cls.container_cls = Containers()

        tenant_name = fakes.TENANTS_LIST.keys()[0]
        tenant_id = fakes.TENANTS_LIST[tenant_name]['id']
        orig_storage_url = "%s/AUTH_%s" % (fakes.STORAGE_ORIG,
                                           tenant_id)
        dest_storage_url = "%s/AUTH_%s" % (fakes.STORAGE_DEST,
                                           tenant_id)
        self.accounts_cls.sync_account(orig_storage_url, "otoken",
                                       dest_storage_url, "dtoken")
        ret_container_list = sorted(x[7] for x in ret)
        default_container_list = sorted(x[0]['name']
                                        for x in fakes.CONTAINERS_LIST)
        self.assertEqual(ret_container_list, default_container_list)

    def test_sync_exception_get_account(self):
        called = []

        def fake_info(self, *args):
            called.append("called")

        def get_account(*args, **kwargs):
            raise swiftclient.client.ClientException("TESTED")
        self.stubs.Set(swiftclient, 'get_account', get_account)
        self.stubs.Set(logging, 'info', fake_info)
        self.accounts_cls.sync_account("http://foo", "token",
                                       "http://bar", "token2")
        self.assertTrue(called)

    def test_sync_account_detect_we_need_to_delete_some_stuff(self):
        # I should get my lazy ass up and just use self.mox stuff
        ret = []
        called = []

        class Containers(object):
            def delete_container(*args, **kwargs):
                called.append("TESTED")

            def sync(*args, **kwargs):
                pass

        self.accounts_cls.container_cls = Containers()

        def get_account(*args, **kwargs):
            # ORIG
            if len(ret) == 0:
                ret.append("TESTED")
                return ({'x-account-container-count': 1},
                        [{'name': 'foo'}])
            # DEST
            else:
                return ({'x-account-container-count': 2},
                        [{'name': 'foo'}, {'name': 'bar'}])
        self.stubs.Set(swiftclient, 'get_account', get_account)
        self.accounts_cls.sync_account("http://foo", "token",
                                       "http://bar", "token2")
        self.assertTrue(called)
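`_base_sync_metadata` above stubs `swiftclient.get_account` so that the first call returns the origin listing and every later call returns the destination listing, while a shared list records the call arguments. The same two-phase dispatch can be sketched standalone (the helper name and results are illustrative):

```python
def make_two_phase_stub(orig_result, dest_result):
    """Return a stub that yields orig_result on the first call and
    dest_result on every later call, recording each call's args."""
    calls = []

    def stub(*args, **kwargs):
        calls.append(args)
        # First call simulates reading the origin, later ones the dest.
        return orig_result if len(calls) == 1 else dest_result

    return stub, calls

stub, calls = make_two_phase_stub('ORIG', 'DEST')
first = stub('http://orig', 'otoken')
second = stub('http://dest', 'dtoken')
third = stub('http://dest', 'dtoken')
```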
@ -1,454 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import logging
import urlparse

import swiftclient

import swsync.containers
import swsync.objects

import tests.units.base as test_base
import tests.units.fakes as fakes


class TestContainersBase(test_base.TestCase):
    def setUp(self):
        super(TestContainersBase, self).setUp()
        self.container_cls = swsync.containers.Containers()

        self.tenant_name = 'foo1'
        self.tenant_id = fakes.TENANTS_LIST[self.tenant_name]['id']
        self.orig_storage_url = '%s/AUTH_%s' % (fakes.STORAGE_ORIG,
                                                self.tenant_id)
        self.orig_storage_cnx = (urlparse.urlparse(self.orig_storage_url),
                                 None)
        self.dest_storage_url = '%s/AUTH_%s' % (fakes.STORAGE_DEST,
                                                self.tenant_id)
        self.dest_storage_cnx = (urlparse.urlparse(self.dest_storage_url),
                                 None)


class TestContainersSyncMetadata(TestContainersBase):

    def _base_sync_metadata(self, orig_dict={},
                            dest_dict={},
                            get_called=[],
                            post_called=[],
                            info_called=[],
                            error_called=[],
                            raise_post_container=False):

        def fake_info(msg):
            info_called.append(msg)

        def fake_error(msg):
            error_called.append(msg)

        def get_container(*args, **kwargs):
            if len(get_called) == 0:
                get_called.append("TESTED")
                return orig_dict
            else:
                get_called.append("TESTED2")
                return dest_dict
        self.stubs.Set(swiftclient, 'get_container', get_container)

        def post_container(url, token, container, headers, **kwargs):
            post_called.append(headers)

            if raise_post_container:
                raise swiftclient.client.ClientException("Error in testing")
        self.stubs.Set(swiftclient, 'post_container', post_container)

        def head_container(*args, **kwargs):
            pass

        self.stubs.Set(swiftclient, 'head_container', head_container)
        self.stubs.Set(logging, 'info', fake_info)
        self.stubs.Set(logging, 'error', fake_error)

        self.container_cls.sync(self.orig_storage_cnx,
                                self.orig_storage_url,
                                'token',
                                self.dest_storage_cnx,
                                self.dest_storage_url,
                                'token', 'cont1')

    def test_sync_containers_metada_added_on_dest(self):
        get_called = []
        post_called = []
        info_called = []

        orig_dict = ({'x-container-meta-om': 'enkl',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '100',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])
        dest_dict = ({'x-container-meta-om': 'enkl',
                      'x-container-meta-psg': 'magique',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '200',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        self._base_sync_metadata(orig_dict, dest_dict, get_called,
                                 post_called, info_called)
        self.assertEqual(len(get_called), 2)
        self.assertEqual(len(post_called), 1)
        self.assertEqual(post_called[0]['x-container-meta-psg'], '')
        self.assertEqual(post_called[0]['x-container-meta-om'], 'enkl')
        self.assertIn('HEADER: sync headers: cont1', info_called)

    def test_sync_containers_metada_added_on_orig(self):
        get_called = []
        post_called = []
        info_called = []

        orig_dict = ({'x-container-meta-om': 'enkl',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '100',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        dest_dict = ({'x-trans-id': 'ffs',
                      'x-container-bytes-used': '200',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        self._base_sync_metadata(orig_dict, dest_dict, get_called,
                                 post_called, info_called)

        self.assertIn('HEADER: sync headers: cont1', info_called)
        self.assertEqual(len(get_called), 2)
        self.assertEqual(len(post_called), 1)
        self.assertEqual(post_called[0]['x-container-meta-om'], 'enkl')

    def test_sync_containers_metada_changed(self):
        get_called = []
        post_called = []
        info_called = []

        orig_dict = ({'x-container-meta-psg': 'magic',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '100',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        dest_dict = ({'x-container-meta-psg': 'marseille',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '200',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        self._base_sync_metadata(orig_dict, dest_dict, get_called,
                                 post_called, info_called)
        self.assertEqual(len(get_called), 2)
        self.assertEqual(len(post_called), 1)
        self.assertEqual(post_called[0]['x-container-meta-psg'], 'magic')
        self.assertIn('HEADER: sync headers: cont1', info_called)

    def test_sync_containers_metadata_raise_client(self):
        get_called = []
        post_called = []
        info_called = []

        orig_dict = ({'x-container-meta-psg': 'magic',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '100',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        dest_dict = ({'x-container-meta-psg': 'marseille',
                      'x-trans-id': 'ffs',
                      'x-container-bytes-used': '200',
                      'x-container-object-count': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])

        self._base_sync_metadata(orig_dict, dest_dict,
                                 get_called, post_called,
                                 info_called, raise_post_container=True)
        self.assertIn('ERROR: updating container metadata: cont1, ',
                      info_called)

    def test_sync_containers_last_modified(self):
        get_called = []
        post_called = []
        info_called = []

        orig_dict = ({'x-container-bytes-used': '100',
                      'x-container-object-count': '2',
                      'x-container-meta-last-modified': '1'},
                     [{'last_modified': '2010', 'name': 'foo'}])
        dest_dict = ({'x-container-bytes-used': '200',
                      'x-container-object-count': '2',
                      'x-container-meta-last-modified': '2'},
                     [{'last_modified': '2010', 'name': 'foo'}])
        self._base_sync_metadata(orig_dict, dest_dict,
                                 get_called, post_called,
                                 info_called, raise_post_container=True)
        self.assertIn('Dest is up-to-date', info_called)

    def test_sync_containers_last_modified_errors(self):
        get_called = []
        post_called = []
        error_called = []

        orig_dict = ({'x-container-bytes-used': '100',
                      'x-container-object-count': '2',
                      'x-container-meta-last-modified': 'foo42'},
                     [{'last_modified': '2010', 'name': 'foo'}])
        dest_dict = ({'x-container-bytes-used': '200',
                      'x-container-object-count': '2',
                      'x-container-meta-last-modified': 'foo43'},
                     [{'last_modified': '2010', 'name': 'foo'}])
        self._base_sync_metadata(orig_dict, dest_dict,
                                 get_called, post_called,
                                 error_called=error_called,
                                 raise_post_container=True)
        self.assertIn('Could not decode last-modified header!', error_called)


class TestContainers(TestContainersBase):
    def test_sync_when_container_nothere(self):
        get_cnt_called = []

        def put_container(*args, **kwargs):
            get_cnt_called.append(args)

        def head_container(*args, **kwargs):
            raise swiftclient.client.ClientException('Not Here')

        def get_container(_, token, name, **kwargs):
            for clist in fakes.CONTAINERS_LIST:
                if clist[0]['name'] == name:
                    return (fakes.CONTAINER_HEADERS, clist[1])

        self.stubs.Set(swiftclient, 'get_container', get_container)
        self.stubs.Set(swiftclient, 'put_container', put_container)
        self.stubs.Set(swiftclient, 'head_container', head_container)

        self.container_cls.sync(
            self.orig_storage_cnx, self.orig_storage_url, 'token',
            self.dest_storage_cnx, self.dest_storage_url, 'token',
            'cont1'
        )
        self.assertEqual(len(get_cnt_called), 1)

    def test_sync_when_container_nothere_raise_when_cant_create(self):
        put_cnt_called = []
        called_info = []

        def fake_info(self, *args):
            called_info.append("called")
        self.stubs.Set(logging, 'info', fake_info)

        def put_container(*args, **kwargs):
            put_cnt_called.append("TESTED")
            raise swiftclient.client.ClientException('TESTED')

        def get_container(_, token, name, **kwargs):
            for clist in fakes.CONTAINERS_LIST:
                if clist[0]['name'] == name:
                    return (fakes.CONTAINER_HEADERS, clist[1])

        def head_container(*args, **kwargs):
            raise swiftclient.client.ClientException('Not Here')

        self.stubs.Set(swiftclient, 'get_container', get_container)
        self.stubs.Set(swiftclient, 'put_container', put_container)
        self.stubs.Set(swiftclient, 'head_container', head_container)

        self.container_cls.sync(
            self.orig_storage_cnx, self.orig_storage_url, 'token',
            self.dest_storage_cnx, self.dest_storage_url, 'token',
            'cont1'
        )
        self.assertEqual(len(put_cnt_called), 1)
        self.assertEqual(len(called_info), 1)

    def test_delete_dest(self):
        # TODO: rewrite these stubs with mox
        get_cnt_called = []
        sync_object_called = []
        delete_object_called = []

        def delete_object(*args, **kwargs):
            delete_object_called.append((args, kwargs))
        self.stubs.Set(swsync.objects.swiftclient,
                       'delete_object', delete_object)

        def head_container(*args, **kwargs):
            return True
        self.stubs.Set(swiftclient, 'head_container', head_container)

        def get_container(*args, **kwargs):
            # MASTER
            if not get_cnt_called:
                cont = fakes.CONTAINERS_LIST[0][0]
                objects = list(fakes.CONTAINERS_LIST[0][1])
                get_cnt_called.append(True)
            # TARGET
            else:
                cont = fakes.CONTAINERS_LIST[0][0]
                objects = list(fakes.CONTAINERS_LIST[0][1])
                # Add an object to target.
                objects.append(fakes.gen_object('NEWOBJ'))

            return (cont, objects)

        def sync_object(*args, **kwargs):
            sync_object_called.append(args)

        self.stubs.Set(swiftclient, 'get_container', get_container)

        self.container_cls.sync_object = sync_object

        self.container_cls.sync(
            self.orig_storage_cnx,
            self.orig_storage_url,
            'token',
            self.dest_storage_cnx,
            self.dest_storage_url,
            'token',
            'cont1')

        self.assertEqual(len(sync_object_called), 0)
        self.assertEqual(len(delete_object_called), 1)

    def test_sync(self):
        get_cnt_called = []
        sync_object_called = []

        def head_container(*args, **kwargs):
            pass

        def get_container(*args, **kwargs):
            # MASTER
            if not get_cnt_called:
                cont = fakes.CONTAINERS_LIST[0][0]
                objects = list(fakes.CONTAINERS_LIST[0][1])
                objects.append(fakes.gen_object('NEWOBJ'))
                get_cnt_called.append(True)
            # TARGET
            else:
                cont = fakes.CONTAINERS_LIST[0][0]
                objects = list(fakes.CONTAINERS_LIST[0][1])

            return (cont, objects)

        def sync_object(*args, **kwargs):
            sync_object_called.append(args)

        self.stubs.Set(swiftclient, 'head_container', head_container)
        self.stubs.Set(swiftclient, 'get_container', get_container)
        self.container_cls.sync_object = sync_object

        self.container_cls.sync(
            self.orig_storage_cnx,
            self.orig_storage_url,
            'token',
            self.dest_storage_cnx,
            self.dest_storage_url,
            'token',
            'cont1')

        self.assertEqual(sync_object_called[0][-1][1], 'NEWOBJ')

    def test_sync_raise_exceptions_get_container_on_orig(self):
        called = []

        def get_container(*args, **kwargs):
            called.append("TESTED")
            raise swiftclient.client.ClientException("TESTED")

        self.stubs.Set(swiftclient, 'get_container', get_container)
        self.container_cls.sync(
            self.orig_storage_cnx,
            self.orig_storage_url,
            'token',
            self.dest_storage_cnx,
            self.dest_storage_url,
            'token',
            'cont1')
        self.assertEqual(len(called), 1)

    def test_sync_raise_exceptions_get_container_on_dest(self):
        called = []
        called_on_dest = []

        def get_container(*args, **kwargs):
            # ORIG
            if len(called) == 0:
                called.append("TESTED")
                return ({}, [{'name': 'PARISESTMAGIQUE',
                              'last_modified': '2010'}])
            # DEST
            else:
                called_on_dest.append("TESTED")
                raise swiftclient.client.ClientException("TESTED")

        def head_container(*args, **kwargs):
            pass

        self.stubs.Set(swiftclient, 'head_container', head_container)
        self.stubs.Set(swiftclient, 'get_container', get_container)
        self.container_cls.sync(
            self.orig_storage_cnx,
            self.orig_storage_url,
            'token',
            self.dest_storage_cnx,
            self.dest_storage_url,
            'token',
            'cont1')
        self.assertEqual(len(called_on_dest), 1)
        self.assertEqual(len(called), 1)

    def test_delete_container(self):
        delete_called = []
        orig_containers = [{'name': 'foo'}]
        dest_containers = [{'name': 'foo'}, {'name': 'bar'}]

        def get_container(*args, **kwargs):
            return ({}, [{'name': 'PARISESTMAGIQUE', 'last_modified': '2010'}])

        def delete(*args, **kwargs):
            delete_called.append("TESTED")

        self.container_cls.delete_object = delete
        self.stubs.Set(swiftclient, 'delete_container', delete)
        self.stubs.Set(swiftclient, 'get_container', get_container)

        self.container_cls.delete_container(
            "cnx1", "token1", orig_containers, dest_containers)

        self.assertEqual(len(delete_called), 2)

    def test_delete_container_raise_exception(self):
        called = []
        orig_containers = [{'name': 'foo'}]
        dest_containers = [{'name': 'foo'}, {'name': 'bar'}]

        def get_container(*args, **kwargs):
            called.append("TESTED")
            raise swiftclient.client.ClientException("TESTED")

        self.stubs.Set(swiftclient, 'get_container', get_container)

        self.container_cls.delete_container(
            "cnx1", "token1", orig_containers, dest_containers)

        self.assertEqual(len(called), 1)
@ -1,288 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Fabien Boucher <fabien.boucher@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import eventlet
import swiftclient

from keystoneclient.exceptions import ClientException as KSClientException

from fakes import FakeKSClient
from fakes import FakeKSTenant
from fakes import FakeKSUser
from fakes import FakeSWConnection

from tests.units import base

from swsync import filler
from swsync import utils


class TestFiller(base.TestCase):
    def setUp(self):
        super(TestFiller, self).setUp()
        self._stubs()

    def _stubs(self):
        self.stubs.Set(swiftclient.client, 'Connection',
                       FakeSWConnection)

    def get_connection(self, *args):
        return swiftclient.client.Connection(
            utils.get_config('auth', 'keystone_origin'),
            'test', 'password',
            tenant_name='test')

    def test_create_containers(self):
        get_containers_created = []
        return_dict_ref = {}

        def put_container(*args, **kwargs):
            get_containers_created.append(args[1])

        self.stubs.Set(FakeSWConnection, 'put_container', put_container)
        cnx = self.get_connection()
        filler.create_containers(cnx, 'test', 3, return_dict_ref)
        self.assertEqual(len(get_containers_created), 3)
        self.assertEqual(get_containers_created[0].split('_')[0],
                         'container')
        meta_amount = len(return_dict_ref['test'].values())
        self.assertEqual(meta_amount, 3)

    def test_create_containers_fail(self):
        get_containers_created = []
        return_dict_ref = {}
        self.attempts = 0

        def put_container(*args, **kwargs):
            if self.attempts == 0:
                self.attempts += 1
                raise swiftclient.client.ClientException('Fake err msg')
            else:
                self.attempts += 1
                get_containers_created.append(args[1])

        self.stubs.Set(FakeSWConnection, 'put_container', put_container)
        cnx = self.get_connection()
        filler.create_containers(cnx, 'test', 3, return_dict_ref)

        self.assertEqual(len(get_containers_created), 2)

    def test_create_objects(self):
        get_object_created = []
        return_dict_ref = {'test': {'container_a': {'objects': []},
                                    'container_b': {'objects': []}}}

        def put_object(*args, **kwargs):
            get_object_created.append(args[1:])

        self.stubs.Set(FakeSWConnection,
                       'put_object',
                       put_object)
        cnx = self.get_connection()
        filler.create_objects(cnx, 'test', 2, 2048, return_dict_ref)
        objects_ca = return_dict_ref['test']['container_a']['objects']
        objects_cb = return_dict_ref['test']['container_b']['objects']
        self.assertEqual(len(objects_ca), 2)
        self.assertEqual(len(objects_cb), 2)

    def test_create_objects_fail(self):
        get_object_created = []
        return_dict_ref = {'test': {'container_a': {'objects': []}}}
        self.attempts = 0

        def put_object(*args, **kwargs):
            if self.attempts == 0:
                self.attempts += 1
                raise swiftclient.client.ClientException('Fake err msg')
            else:
                self.attempts += 1
                get_object_created.append(args[1:])

        self.stubs.Set(FakeSWConnection,
                       'put_object',
                       put_object)
        cnx = self.get_connection()
        filler.create_objects(cnx, 'test', 2, 2048, return_dict_ref)
        objects_ca = return_dict_ref['test']['container_a']['objects']
        self.assertEqual(len(objects_ca), 1)

    def test_fill_swift(self):
        self.cont_cnt = 0
        self.obj_cnt = 0
        return_dict_ref = {}

        def create_objects(*args, **kwargs):
            self.obj_cnt += 1

        def create_containers(*args, **kwargs):
            self.cont_cnt += 1

        def swift_cnx(*args, **kargs):
            return self.get_connection()

        self.stubs.Set(filler, 'swift_cnx', swift_cnx)
        self.stubs.Set(filler, 'create_objects', create_objects)
        self.stubs.Set(filler, 'create_containers', create_containers)

        concurrency = int(utils.get_config(
            'concurrency', 'filler_swift_client_concurrency'))
        pool = eventlet.GreenPool(concurrency)

        created = {('account1', 'account1_id'): ['test', 'test_id', 'role_id'],
                   ('account2', 'account2_id'): ['test', 'test_id', 'role_id']}
        filler.fill_swift(pool, created, 1, 1, 2048, return_dict_ref)
        self.assertEqual(self.cont_cnt, 2)
        self.assertEqual(self.obj_cnt, 2)

    def test_create_swift_user(self):
        self.create_cnt = 0
        self.role_cnt = 0

        def create(*args, **kargs):
            self.create_cnt += 1
            return FakeKSUser()

        def add_user_role(*args, **kargs):
            self.role_cnt += 1

        co = utils.get_config('auth',
                              'keystone_origin_admin_credentials').split(':')
        tenant_name, username, password = co
        client = FakeKSClient()
        client.roles.add_user_role = add_user_role
        client.users.create = create
        filler.create_swift_user(client, 'account1', 'account1_id', 1)

        self.assertEqual(self.create_cnt, 1)
        self.assertEqual(self.role_cnt, 1)

    def test_create_swift_user_fail(self):
        self.pa = 0

        def create(*args, **kargs):
            if self.pa == 0:
                self.pa += 1
                raise KSClientException('Fake msg')
            else:
                self.pa += 1
                return FakeKSUser()

        def add_user_role(*args, **kargs):
            pass

        co = utils.get_config('auth',
                              'keystone_origin_admin_credentials').split(':')
        tenant_name, username, password = co
        client = FakeKSClient()
        client.roles.add_user_role = add_user_role
        client.users.create = create
        users = filler.create_swift_user(client, 'account1', 'account1_id', 3)

        self.assertEqual(len(users), 2)

    def test_create_swift_account(self):
        self.ret_index = {}
        self.user_cnt = 0

        def create_swift_user(*args):
            self.user_cnt += 1

        self.stubs.Set(filler, 'create_swift_user', create_swift_user)

        concurrency = int(utils.get_config(
            'concurrency', 'filler_keystone_client_concurrency'))
        pile = eventlet.GreenPile(concurrency)
        client = FakeKSClient()
        filler.create_swift_account(client, pile, 1, 1, self.ret_index)

        self.assertEqual(self.user_cnt, 1)
        self.assertEqual(len(self.ret_index.keys()), 1)

    def test_create_swift_account_fail(self):
        self.ret_index = {}
        self.pa = 0

        def create_tenant(*args):
            if self.pa == 0:
                self.pa += 1
                raise KSClientException('Fake msg')
            else:
                self.pa += 1
                return FakeKSTenant('foo1')

        def create_swift_user(*args):
            pass

        client = FakeKSClient()

        self.stubs.Set(client.tenants, 'create', create_tenant)
        self.stubs.Set(filler, 'create_swift_user', create_swift_user)

        concurrency = int(utils.get_config(
            'concurrency', 'filler_keystone_client_concurrency'))
        pile = eventlet.GreenPile(concurrency)
        filler.create_swift_account(client, pile, 3, 1, self.ret_index)

        self.assertEqual(len(self.ret_index.keys()), 2)

    def test_delete_account(self):
        self.delete_t_cnt = 0
        self.delete_u_cnt = 0

        def delete_t(*args):
            self.delete_t_cnt += 1

        def delete_u(*args):
            self.delete_u_cnt += 1

        client = FakeKSClient()
        client.tenants.delete = delete_t
        client.users.delete = delete_u
        filler.delete_account(client,
                              [FakeKSUser().id, ],
                              ('account1', 'account1_id'))

        self.assertEqual(self.delete_t_cnt, 1)
        self.assertEqual(self.delete_u_cnt, 1)

    def test_delete_account_content(self):
        self.cnt_ga = 0
        self.cnt_co = 0
        self.cnt_do = 0

        filler.swift_cnx = self.get_connection

        def get_account(*args, **kwargs):
            self.cnt_ga += 1
            return (None, ({'name': 'cont1'}, {'name': 'cont2'}))

        def get_container(*args, **kwargs):
            self.cnt_co += 1
            return (None, ({'name': 'obj1'}, {'name': 'obj2'}))

        def delete_object(*args, **kwargs):
            self.cnt_do += 1

        self.stubs.Set(FakeSWConnection, 'get_account', get_account)
        self.stubs.Set(FakeSWConnection, 'get_container', get_container)
        self.stubs.Set(FakeSWConnection, 'delete_object', delete_object)

        filler.delete_account_content('account1', ['user', 'user_id'])

        self.assertEqual(self.cnt_ga, 1)
        self.assertEqual(self.cnt_co, 2)
        self.assertEqual(self.cnt_do, 4)
@ -1,156 +0,0 @@
# -*- encoding: utf-8 -*-

# Copyright 2013 eNovance.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Author: "Fabien Boucher <fabien.boucher@enovance.com>"
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import unittest

from middlewares import last_modified as middleware
import swift.common.swob as swob


class FakeApp(object):
    def __init__(self, status_headers_body=None):
        self.status_headers_body = status_headers_body
        if not self.status_headers_body:
            self.status_headers_body = ('204 No Content', {}, '')

    def __call__(self, env, start_response):
        status, headers, body = self.status_headers_body
        return swob.Response(status=status, headers=headers,
                             body=body)(env, start_response)


class FakeRequest(object):
    def get_response(self, app):
        pass


class TestLastModifiedMiddleware(unittest.TestCase):

    def _make_request(self, path, **kwargs):
        req = swob.Request.blank("/v1/AUTH_account/%s" % path, **kwargs)
        return req

    def setUp(self):
        self.conf = {'key_name': 'Last-Modified'}
        self.test_default = middleware.filter_factory(self.conf)(FakeApp())

    def test_denied_method_conf(self):
        app = FakeApp()
        test = middleware.filter_factory({})(app)
        self.assertEqual(test.key_name, 'Last-Modified')
        test = middleware.filter_factory({'key_name': "Last Modified"})(app)
        self.assertEqual(test.key_name, 'Last-Modified')
        test = middleware.filter_factory({'key_name': "Custom Key"})(app)
        self.assertEqual(test.key_name, 'Custom-Key')

    def test_PUT_on_container(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont',
                                 environ={'REQUEST_METHOD': 'PUT'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, True)

    def test_POST_on_container(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont',
                                 environ={'REQUEST_METHOD': 'POST'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, True)

    def test_DELETE_on_container(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont',
                                 environ={'REQUEST_METHOD': 'DELETE'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, False)

    def test_GET_on_container_and_object(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont',
                                 environ={'REQUEST_METHOD': 'GET'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, False)
        self.called = False
        req = self._make_request('cont/obj',
                                 environ={'REQUEST_METHOD': 'GET'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, False)

    def test_POST_on_object(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont/obj',
                                 environ={'REQUEST_METHOD': 'POST'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, True)

    def test_PUT_on_object(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont/obj',
                                 environ={'REQUEST_METHOD': 'PUT'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, True)

    def test_DELETE_on_object(self):
        self.called = False

        def make_pre_authed_request(*args, **kargs):
            self.called = True
            return FakeRequest()

        middleware.wsgi.make_pre_authed_request = make_pre_authed_request
        req = self._make_request('cont/obj',
                                 environ={'REQUEST_METHOD': 'DELETE'})
        req.get_response(self.test_default)
        self.assertEqual(self.called, True)
@ -1,179 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import eventlet
import swift
try:
    from swift.container.sync import _Iter2FileLikeObject as FileLikeIter
except ImportError:
    # Nov 2013: swift.common.utils now includes a more generic object
    from swift.common.utils import FileLikeIter

import swiftclient

import swsync.objects as swobjects
import tests.units.base as test_base
import tests.units.fakes as fakes


def fake_http_connect(status, body='', headers={}, resp_waitfor=None,
                      connect_waitfor=None):
    class FakeConn(object):
        def __init__(self, status):
            self.reason = 'PSG'
            self.status = status
            self.body = body
            if connect_waitfor:
                eventlet.sleep(int(connect_waitfor))

        def getheaders(self):
            return headers

        def getresponse(self):
            if resp_waitfor:
                eventlet.sleep(int(resp_waitfor))
            return self

        def read(self, amt=None):
            rv = self.body[:amt]
            self.body = self.body[amt:]
            return rv

    def connect(*args, **kwargs):
        return FakeConn(status)
    return connect


class TestObject(test_base.TestCase):
    def setUp(self):
        super(TestObject, self).setUp()
        self.tenant_name = 'foo1'
        self.tenant_id = fakes.TENANTS_LIST[self.tenant_name]['id']
        self.orig_storage_url = "%s/AUTH_%s" % (fakes.STORAGE_ORIG,
                                                self.tenant_id)
        self.dest_storage_url = "%s/AUTH_%s" % (fakes.STORAGE_DEST,
                                                self.tenant_id)

    def test_quote(self):
        utf8_chars = u'\uF10F\uD20D\uB30B\u9409\u8508\u5605\u3703\u1801'
        try:
            swobjects.quote(utf8_chars)
        except KeyError:
            self.fail("utf8 was not properly quoted")

    def test_get_object_not_found(self):
        new_connect = fake_http_connect(404)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)

        self.assertRaises(swiftclient.ClientException,
                          swobjects.get_object,
                          self.orig_storage_url, "token", "cont1", "obj1")

    def test_sync_object(self):
        body = ("X" * 3) * 1024
        new_connect = fake_http_connect(200, body)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)

        def put_object(url, name=None, headers=None, contents=None):
            self.assertEqual('obj1', name)
            self.assertIn('x-auth-token', headers)
            self.assertIsInstance(contents, FileLikeIter)
            contents_read = contents.read()
            self.assertEqual(len(contents_read), len(body))

        self.stubs.Set(swobjects.swiftclient, 'put_object', put_object)

        swobjects.sync_object(self.orig_storage_url,
                              "token", self.dest_storage_url, "token",
                              "cont1", ("etag", "obj1"))

    def test_sync_object_utf8(self):
        utf_obj = "யாமறிந்த"
        body = "FOO"
        new_connect = fake_http_connect(200, body)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)

        def put_object(url, name=None, headers=None, contents=None):
            # The container segment of the URL is quoted
            self.assertFalse(isinstance(url.split("/")[-1], unicode))
            self.assertEqual(utf_obj, name)

        self.stubs.Set(swobjects.swiftclient, 'put_object', put_object)

        swobjects.sync_object(self.orig_storage_url,
                              "token", self.dest_storage_url, "token",
                              "contגלאָז", ("etag", utf_obj))

    def test_get_object_chunked(self):
        chunk_size = 32
        expected_chunk_time = 3
        body = ("X" * expected_chunk_time) * chunk_size

        new_connect = fake_http_connect(200, body)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)

        headers, gen = swobjects.get_object(self.orig_storage_url,
                                            "token", "cont1", "obj1",
                                            resp_chunk_size=chunk_size)
        sent_time = 0
        for chunk in gen:
            sent_time += 1
        self.assertEqual(sent_time, expected_chunk_time)

    def test_get_object_full(self):
        new_connect = fake_http_connect(200, body='foobar')
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)

        headers, body = swobjects.get_object(self.orig_storage_url,
                                             "token", "cont1", "obj1",
                                             resp_chunk_size=None)
        self.assertEqual(body, 'foobar')

    def test_get_headers(self):
        headers = {'X-FOO': 'BaR'}.items()
        new_connect = fake_http_connect(200, headers=headers)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)

        headers, gen = swobjects.get_object(self.orig_storage_url,
                                            "token",
                                            "cont1",
                                            "obj1")
        self.assertIn('x-foo', headers)
        self.assertEqual(headers['x-foo'], 'BaR')

    def test_get_object_over_conn_timeout(self):
        new_connect = fake_http_connect(200, connect_waitfor=2)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)
        self.assertRaises(eventlet.Timeout,
                          swobjects.get_object,
                          self.orig_storage_url, "token", "cont1", "obj1",
                          conn_timeout=1)

    def test_get_object_over_resp_timeout(self):
        new_connect = fake_http_connect(200, resp_waitfor=2)
        self.stubs.Set(swift.common.bufferedhttp,
                       'http_connect_raw', new_connect)
        self.assertRaises(eventlet.Timeout,
                          swobjects.get_object,
                          self.orig_storage_url, "token", "cont1", "obj1",
                          response_timeout=1)
@ -1,72 +0,0 @@
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import ConfigParser
import cStringIO as StringIO

import swsync.utils
import tests.units.base as test_base


class TestAccount(test_base.TestCase):
    def test_parse_ini_file_not_found(self):
        self.stubs.Set(swsync.utils.os.path, 'exists',
                       lambda x: False)
        self.assertRaises(swsync.utils.ConfigurationError,
                          swsync.utils.parse_ini, "/tmp/foo")

    def test_parse_ini_bad_file(self):
        s = StringIO.StringIO("foo=bar")
        self.assertRaises(ConfigParser.MissingSectionHeaderError,
                          swsync.utils.parse_ini, s)

    def test_parse_ini(self):
        s = StringIO.StringIO("[foo]\nfoo=bar")
        self.assertIsInstance(swsync.utils.parse_ini(s),
                              ConfigParser.RawConfigParser)

    def test_get_config(self):
        s = StringIO.StringIO("[foo]\nkey=bar")
        cfg = swsync.utils.parse_ini(s)
        self.assertEqual(swsync.utils.get_config('foo', 'key', _config=cfg),
                         'bar')

    def test_get_config_no_section(self):
        s = StringIO.StringIO("[pasla]\nkey=bar")
        cfg = swsync.utils.parse_ini(s)
        self.assertRaises(swsync.utils.ConfigurationError,
                          swsync.utils.get_config,
                          'foo', 'key', _config=cfg)

    def test_get_config_with_default(self):
        s = StringIO.StringIO("[foo]\n")
        cfg = swsync.utils.parse_ini(s)
        self.assertEqual(swsync.utils.get_config('foo', 'key', default='MEME',
                                                 _config=cfg),
                         'MEME')

    def test_get_config_auto_parsed(self):
        s = StringIO.StringIO("[foo]\nkey=bar")
        cfg = swsync.utils.parse_ini(s)
        self.stubs.Set(swsync.utils, 'CONFIG', cfg)
        self.assertEqual(swsync.utils.get_config('foo', 'key'), 'bar')

    def test_get_config_no_value(self):
        s = StringIO.StringIO("[foo]\n")
        cfg = swsync.utils.parse_ini(s)
        self.assertRaises(swsync.utils.ConfigurationError,
                          swsync.utils.get_config,
                          'foo', 'key', _config=cfg)
@ -1,78 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import argparse
import sys
import time

import keystoneclient.v2_0.client
import swiftclient

import swsync.utils

MAX_RETRIES = 5


def main():
    """Delete some accounts."""
    parser = argparse.ArgumentParser(add_help=True)
    parser.add_argument('-d', action='store_true',
                        dest="dest",
                        help='Check destination')
    parser.add_argument('number', nargs=1, type=int)

    args = parser.parse_args()

    print "Are you sure you want to delete? Control-C if you don't"
    time.sleep(5)
    number = args.number[0]

    credentials = swsync.utils.get_config(
        'auth', 'keystone_origin_admin_credentials')
    (tenant_name, username, password) = credentials.split(':')
    if args.dest:
        auth_url = swsync.utils.get_config('auth', 'keystone_dest')
    else:
        auth_url = swsync.utils.get_config('auth', 'keystone_origin')
    keystone_cnx = keystoneclient.v2_0.client.Client(auth_url=auth_url,
                                                     username=username,
                                                     password=password,
                                                     tenant_name=tenant_name)

    storage_url, admin_token = swiftclient.client.Connection(
        auth_url, '%s:%s' % (tenant_name, username), password,
        auth_version=2).get_auth()
    bare_storage_url = storage_url[:storage_url.find('AUTH_')] + "AUTH_"

    TENANT_LIST = keystone_cnx.tenants.list()
    mid = int(len(TENANT_LIST) / 2)
    for tenant in TENANT_LIST[mid:mid + number]:
        tenant_storage_url = bare_storage_url + tenant.id
        swiftcnx = swiftclient.client.Connection(preauthurl=tenant_storage_url,
                                                 preauthtoken=admin_token,
                                                 retries=MAX_RETRIES)
        _, containers = swiftcnx.get_account()
        for cont in containers:
            _, objects = swiftcnx.get_container(cont['name'])
            print "deleting %s" % (cont['name'])
            for obj in objects:
                print "deleting %s/%s" % (cont['name'], obj['name'])
                swiftcnx.delete_object(cont['name'], obj['name'])
            swiftcnx.delete_container(cont['name'])


if __name__ == '__main__':
    main()
@ -1,8 +0,0 @@
simplejson
http://tarballs.openstack.org/swift/swift-master.tar.gz#egg=swift
python-swiftclient
python-dateutil
netifaces
python-keystoneclient
eventlet
pastedeploy>=1.3.3
@ -1,109 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (C) 2013 eNovance SAS <licensing@enovance.com>
#
# Author: Chmouel Boudjnah <chmouel@enovance.com>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Simple script to report global swift cluster usage by querying the keystone
server.
"""
import argparse

import keystoneclient.v2_0.client
import swiftclient

import swsync.utils

MAX_RETRIES = 10

# Nicer filesize reporting; make it optional
try:
    import hurry.filesize
    prettysize = hurry.filesize.size
except ImportError:
    prettysize = None


def get_swift_auth(auth_url, tenant, user, password):
    """Get a swift connection from args."""
    return swiftclient.client.Connection(
        auth_url,
        '%s:%s' % (tenant, user),
        password,
        auth_version=2).get_auth()


def get_ks_auth_orig():
    """Get a keystone cnx from config."""
    orig_auth_url = swsync.utils.get_config('auth', 'keystone_origin')
    cfg = swsync.utils.get_config('auth', 'keystone_origin_admin_credentials')
    (tenant_name, username, password) = cfg.split(':')

    return keystoneclient.v2_0.client.Client(auth_url=orig_auth_url,
                                             username=username,
                                             password=password,
                                             tenant_name=tenant_name)


def main():
    parser = argparse.ArgumentParser(add_help=True)
    parser.add_argument('-d', action='store_true',
                        dest="dest",
                        help='Check destination')
    parser.add_argument('-r', action='store_true',
                        dest="raw_output",
                        help='No human output')

    args = parser.parse_args()

    keystone_cnx = get_ks_auth_orig()
    if args.dest:
        auth_url = swsync.utils.get_config('auth', 'keystone_dest')
    else:
        auth_url = swsync.utils.get_config('auth', 'keystone_origin')
    credentials = swsync.utils.get_config(
        'auth', 'keystone_origin_admin_credentials')
    tenant, admin_user, admin_password = (credentials.split(':'))

    storage_url, token = get_swift_auth(
        auth_url, tenant,
        admin_user, admin_password)

    bare_storage_url = storage_url[:storage_url.find('AUTH_')] + "AUTH_"

    total_size = 0
    total_containers = 0
    total_objects = 0
    for tenant in keystone_cnx.tenants.list():
        tenant_storage_url = bare_storage_url + tenant.id
        cnx = swiftclient.client.Connection(preauthurl=tenant_storage_url,
                                            preauthtoken=token,
                                            retries=MAX_RETRIES)
        try:
            head = cnx.head_account()
        # TOO BUSY
        except swiftclient.client.ClientException:
            continue
        total_size += int(head['x-account-bytes-used'])
        total_containers += int(head['x-account-container-count'])
        total_objects += int(head['x-account-object-count'])

    size = (prettysize and not args.raw_output) and \
        prettysize(total_size) or total_size
    print "Total size: %s" % (size)
    print "Total containers: %d" % (total_containers)
    print "Total objects: %d" % (total_objects)


if __name__ == '__main__':
    main()
@ -1,10 +0,0 @@
coverage
discover
distribute>=0.6.24
mox
nose
nosehtmloutput
testrepository>=0.0.13
testtools>=0.9.22
unittest2
hacking
33 tox.ini
@ -1,33 +0,0 @@
[tox]
envlist = py27,pep8

[testenv]
setenv = VIRTUAL_ENV={envdir}
         NOSE_WITH_OPENSTACK=1
         NOSE_OPENSTACK_COLOR=1
         NOSE_OPENSTACK_RED=0.05
         NOSE_OPENSTACK_YELLOW=0.025
         NOSE_OPENSTACK_SHOW_ELAPSED=1
         NOSE_OPENSTACK_STDOUT=1

deps = -r{toxinidir}/tools/pip-requires
       -r{toxinidir}/tools/test-requires
commands = python setup.py testr --testr-args="{posargs}"

[testenv:pep8]
sitepackages = False
commands = flake8 --show-source swsync bin setup.py tests

[testenv:venv]
commands = {posargs}

[testenv:cover]
commands = python setup.py testr --coverage

[tox:jenkins]
downloadcache = ~/cache/pip

[flake8]
ignore = E12,E711,E721,E712,H302,H303,H403,H404,H803
builtins = _
exclude = .venv,.git,.tox,dist,doc,*openstack/common*,*lib/python*,*egg,build,plugins,tools