Template for ModelMigrationSync for external repos

This section contains a template for a test which checks that the Python models for database tables are synchronized with the alembic migrations that create the database schema. This test should be implemented in all driver/plugin repositories that were split out from Neutron.

What does the test do?

This test compares the database models with the schema produced by the existing migrations. It is based on ModelsMigrationsSync, which is provided by oslo.db and was adapted for Neutron. It compares core Neutron models and vendor-specific models with the migrations from Neutron core and the migrations from the driver/plugin repo. The test is functional: it runs against both the MySQL and PostgreSQL dialects. A detailed description of this test can be found in the Neutron Database Layer section, under Tests to verify that database migrations and models are in sync.
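For reference, the oslo.db base class defines three hooks that every concrete test must provide. The sketch below is purely illustrative (the class name is made up here); the Neutron-adapted base class used in step 2 already implements most of this wiring.

from oslo_db.sqlalchemy import test_migrations


class ModelsSyncSketch(test_migrations.ModelsMigrationsSync):
    """Illustrative only: the hooks a concrete subclass must implement."""

    def get_engine(self):
        # Return the SQLAlchemy engine connected to the test database.
        raise NotImplementedError

    def get_metadata(self):
        # Return the MetaData object that holds all model definitions.
        raise NotImplementedError

    def db_sync(self, engine):
        # Apply all alembic migrations against the given engine.
        raise NotImplementedError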

Steps for implementing the test

1. Import all models in one place

Create a module networking_foo/db/models/head.py with the following content:

from neutron.db.migration.models import head

from networking_foo import models  # noqa
# Alternatively, import separate modules here if the models are not in one
# models.py file


def get_metadata():
    return head.model_base.BASEV2.metadata
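For context, the models imported here are the repo's own SQLAlchemy models built on Neutron's BASEV2 declarative base, so that they end up in the same metadata as the core models. A minimal, purely hypothetical example (the class and table names are illustrative, and the pre-neutron-lib location neutron.db.model_base is assumed):

import sqlalchemy as sa

from neutron.db import model_base


class FooEndpoint(model_base.BASEV2):
    """Hypothetical vendor table; real models live in networking_foo."""

    __tablename__ = 'foo_endpoints'

    id = sa.Column(sa.String(36), primary_key=True)
    name = sa.Column(sa.String(255))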

2. Implement the test module

The test uses external.py from Neutron. This file contains lists of the names of tables that were moved out of Neutron:

VPNAAS_TABLES = [...]

LBAAS_TABLES = [...]

FWAAS_TABLES = [...]

# Arista ML2 driver Models moved to openstack/networking-arista
REPO_ARISTA_TABLES = [...]

# Models moved to openstack/networking-cisco
REPO_CISCO_TABLES = [...]

...

TABLES = (FWAAS_TABLES + LBAAS_TABLES + VPNAAS_TABLES + ...
          + REPO_ARISTA_TABLES + REPO_CISCO_TABLES)

The test also uses VERSION_TABLE, the name of the database table that stores the revision id of the head migration. It is preferable to keep this variable in networking_foo/db/migration/alembic_migrations/__init__.py so that it is easy to use in the test.
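For example, that module can contain nothing more than the single constant; the table name below is illustrative and must match the version table configured in the repo's alembic environment:

# networking_foo/db/migration/alembic_migrations/__init__.py
VERSION_TABLE = 'alembic_version_foo'  # illustrative name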

Create a module networking_foo/tests/functional/db/test_migrations.py with the following content:

from oslo_config import cfg

from neutron.db.migration.alembic_migrations import external
from neutron.db.migration import cli as migration
from neutron.tests.common import base
from neutron.tests.functional.db import test_migrations

from networking_foo.db.migration import alembic_migrations
from networking_foo.db.models import head

# EXTERNAL_TABLES should contain all names of tables that are not related to
# current repo.
EXTERNAL_TABLES = set(external.TABLES) - set(external.REPO_FOO_TABLES)


class _TestModelsMigrationsFoo(test_migrations._TestModelsMigrations):

    def db_sync(self, engine):
        # Run the Neutron core migrations and the migrations of every
        # installed subproject (including this repo) against the test
        # database.
        cfg.CONF.set_override('connection', engine.url, group='database')
        for conf in migration.get_alembic_configs():
            self.alembic_config = conf
            self.alembic_config.neutron_config = cfg.CONF
            migration.do_alembic_command(conf, 'upgrade', 'heads')

    def get_metadata(self):
        return head.get_metadata()

    def include_object(self, object_, name, type_, reflected, compare_to):
        # Exclude alembic version tables and tables owned by other repos
        # from the comparison.
        if type_ == 'table' and (name == 'alembic' or
                                 name == alembic_migrations.VERSION_TABLE or
                                 name in EXTERNAL_TABLES):
            return False
        else:
            return True


class TestModelsMigrationsMysql(_TestModelsMigrationsFoo,
                                base.MySQLTestCase):
    pass


class TestModelsMigrationsPsql(_TestModelsMigrationsFoo,
                               base.PostgreSQLTestCase):
    pass

3. Add functional requirements

Create a separate file networking_foo/tests/functional/requirements.txt containing the requirements that are needed for successful test execution:

psutil>=1.1.1,<2.0.0
psycopg2
PyMySQL>=0.6.2  # MIT License

An example implementation can be found in the VPNaaS repository (neutron-vpnaas).