cubicweb/dataimport/test/test_massive_store.py
author Sylvain Thénault <sylvain.thenault@logilab.fr>
Mon, 01 Feb 2016 17:50:05 +0100
changeset 11329 a8cab8fb54ba
parent 11328 9f2d7da47526
child 11332 7187bf515251
permissions -rw-r--r--
[dataimport] drop massive store's flush_metadata method

This method handled a temporary table which sounds useless, and was even buggy in some cases (metadata for a single entity type was only flushed once, even if flush was called several times). Instead, simply call the method doing the job after entities have been inserted (and make it private along the way).
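The refactor described by this changeset can be illustrated with a small toy model. All names below (`SketchStore`, `_insert_metadata`) are hypothetical and are not the actual MassiveObjectStore internals; the point is only the shape of the change: metadata insertion moves out of a public `flush_metadata()` backed by a temporary table and into a private helper called from `flush()` right after entities are inserted, so every flush covers every pending entity.

```python
class SketchStore:
    """Toy model of the refactor: metadata rows are written by a private
    helper invoked from flush(), right after entities are inserted, instead
    of being buffered in a temporary table behind a public flush_metadata().
    All names here are illustrative, not the real MassiveObjectStore API.
    """

    def __init__(self):
        self.entities = []   # pending (eid, etype) rows awaiting insertion
        self.metadata = []   # rows of the 'entities' metadata table

    def prepare_insert_entity(self, etype, **attrs):
        eid = len(self.entities) + len(self.metadata) + 1
        self.entities.append((eid, etype))
        return eid

    def flush(self):
        # Insert pending entities, then immediately record their metadata
        # via the (now private) helper. Every flush handles every pending
        # entity, so flushing the same entity type twice no longer loses rows.
        self._insert_metadata()

    def _insert_metadata(self):
        self.metadata.extend(self.entities)
        del self.entities[:]
```

With this shape, calling `flush()` repeatedly for the same entity type records metadata for each batch, which is the bug the changeset message says the temporary-table approach got wrong.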
# -*- coding: utf-8 -*-
# copyright 2013-2016 LOGILAB S.A. (Paris, FRANCE), all rights reserved.
# contact http://www.logilab.fr -- mailto:contact@logilab.fr
#
# This program is free software: you can redistribute it and/or modify it under
# the terms of the GNU Lesser General Public License as published by the Free
# Software Foundation, either version 2.1 of the License, or (at your option)
# any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
# FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more
# details.
#
# You should have received a copy of the GNU Lesser General Public License along
# with this program. If not, see <http://www.gnu.org/licenses/>.
"""Massive store test case"""

import itertools

from cubicweb.devtools import testlib, PostgresApptestConfiguration
from cubicweb.devtools import startpgcluster, stoppgcluster
from cubicweb.dataimport import ucsvreader, stores
from cubicweb.dataimport.massive_store import MassiveObjectStore, PGHelper

from test_stores import NoHookRQLObjectStoreWithCustomMDGenStoreTC


def setUpModule():
    startpgcluster(__file__)


def tearDownModule(*args):
    stoppgcluster(__file__)


class MassiveObjectStoreWithCustomMDGenStoreTC(NoHookRQLObjectStoreWithCustomMDGenStoreTC):
    configcls = PostgresApptestConfiguration

    def store_impl(self, cnx):
        source = cnx.create_entity('CWSource', type=u'datafeed', name=u'test', url=u'test')
        cnx.commit()
        metagen = stores.MetadataGenerator(cnx, source=cnx.repo.sources_by_eid[source.eid])
        return MassiveObjectStore(cnx, metagen=metagen)


class MassImportSimpleTC(testlib.CubicWebTC):
    configcls = PostgresApptestConfiguration
    appid = 'data-massimport'

    def cast(self, _type, value):
        try:
            return _type(value)
        except ValueError:
            return None

    def push_geonames_data(self, dumpname, store):
        # Push timezones
        cnx = store._cnx
        for code, gmt, dst, raw_offset in ucsvreader(open(self.datapath('timeZones.txt'), 'rb'),
                                                     delimiter='\t'):
            cnx.create_entity('TimeZone', code=code, gmt=float(gmt),
                              dst=float(dst), raw_offset=float(raw_offset))
        timezone_code = dict(cnx.execute('Any C, X WHERE X is TimeZone, X code C'))
        # Push data
        for ind, infos in enumerate(ucsvreader(open(dumpname, 'rb'),
                                               delimiter='\t',
                                               ignore_errors=True)):
            latitude = self.cast(float, infos[4])
            longitude = self.cast(float, infos[5])
            population = self.cast(int, infos[14])
            elevation = self.cast(int, infos[15])
            gtopo = self.cast(int, infos[16])
            feature_class = infos[6]
            if len(infos[6]) != 1:
                feature_class = None
            entity = {'name': infos[1],
                      'asciiname': infos[2],
                      'alternatenames': infos[3],
                      'latitude': latitude, 'longitude': longitude,
                      'feature_class': feature_class,
                      'alternate_country_code': infos[9],
                      'admin_code_3': infos[12],
                      'admin_code_4': infos[13],
                      'population': population, 'elevation': elevation,
                      'gtopo30': gtopo, 'timezone': timezone_code.get(infos[17]),
                      'cwuri': u'http://sws.geonames.org/%s/' % int(infos[0]),
                      'geonameid': int(infos[0]),
                      }
            store.prepare_insert_entity('Location', **entity)

    def test_autoflush_metadata(self):
        with self.admin_access.repo_cnx() as cnx:
            crs = cnx.system_sql('SELECT * FROM entities WHERE type=%(t)s',
                                 {'t': 'Location'})
            self.assertEqual(len(crs.fetchall()), 0)
            store = MassiveObjectStore(cnx)
            store.prepare_insert_entity('Location', name=u'toto')
            store.flush()
            store.commit()
            store.finish()
            cnx.commit()
        with self.admin_access.repo_cnx() as cnx:
            crs = cnx.system_sql('SELECT * FROM entities WHERE type=%(t)s',
                                 {'t': 'Location'})
            self.assertEqual(len(crs.fetchall()), 1)

    def test_massimport_etype_metadata(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            timezone_eid = store.prepare_insert_entity('TimeZone')
            store.prepare_insert_entity('Location', timezone=timezone_eid)
            store.flush()
            store.commit()
            eid, etname = cnx.execute('Any X, TN WHERE X timezone TZ, X is T, '
                                      'T name TN')[0]
            self.assertEqual(cnx.entity_from_eid(eid).cw_etype, etname)

    def test_drop_index(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            cnx.commit()
        with self.admin_access.repo_cnx() as cnx:
            crs = cnx.system_sql('SELECT indexname FROM pg_indexes')
            indexes = [r[0] for r in crs.fetchall()]
        self.assertNotIn('entities_pkey', indexes)
        self.assertNotIn('entities_extid_idx', indexes)
        self.assertNotIn('owned_by_relation_pkey', indexes)
        self.assertNotIn('owned_by_relation_to_idx', indexes)

    def test_drop_index_recreation(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            store.finish()
            cnx.commit()
        with self.admin_access.repo_cnx() as cnx:
            crs = cnx.system_sql('SELECT indexname FROM pg_indexes')
            indexes = [r[0] for r in crs.fetchall()]
        self.assertIn('entities_pkey', indexes)
        self.assertIn('entities_extid_idx', indexes)
        self.assertIn('owned_by_relation_p_key', indexes)
        self.assertIn('owned_by_relation_to_idx', indexes)

    def test_eids_seq_range(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx, eids_seq_range=1000)
            store.restart_eid_sequence(50000)
            store.prepare_insert_entity('Location', name=u'toto')
            store.flush()
            cnx.commit()
        with self.admin_access.repo_cnx() as cnx:
            crs = cnx.system_sql("SELECT * FROM entities_id_seq")
            self.assertGreater(crs.fetchone()[0], 50000)

    def test_eid_entity(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx, eids_seq_range=1000)
            store.restart_eid_sequence(50000)
            eid = store.prepare_insert_entity('Location', name=u'toto')
            store.flush()
            self.assertGreater(eid, 50000)

    def test_eid_entity_2(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            store.restart_eid_sequence(50000)
            eid = store.prepare_insert_entity('Location', name=u'toto', eid=10000)
            store.flush()
        self.assertEqual(eid, 10000)

    @staticmethod
    def get_db_descr(cnx):
        pgh = PGHelper(cnx)
        all_tables = cnx.system_sql('''
SELECT table_name
FROM information_schema.tables
WHERE table_schema = %(s)s''', {'s': pgh.pg_schema}).fetchall()
        all_tables_descr = {}
        for tablename, in all_tables:
            all_tables_descr[tablename] = set(pgh.table_indexes(tablename)).union(
                set(pgh.table_constraints(tablename)))
        return all_tables_descr

    def test_identical_schema(self):
        with self.admin_access.repo_cnx() as cnx:
            init_descr = self.get_db_descr(cnx)
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            store.prepare_insert_entity('Location', name=u'toto')
            store.finish()
        with self.admin_access.repo_cnx() as cnx:
            final_descr = self.get_db_descr(cnx)
        self.assertEqual(init_descr, final_descr)

    def test_on_commit_callback(self):
        counter = itertools.count()
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx, on_commit_callback=lambda: next(counter))
            store.prepare_insert_entity('Location', name=u'toto')
            store.flush()
            store.commit()
        self.assertEqual(next(counter), 1)

    def test_on_rollback_callback(self):
        counter = itertools.count()
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx, on_rollback_callback=lambda *_: next(counter))
            # 'nm' is not a Location attribute: the flush below fails and
            # triggers the rollback callback
            store.prepare_insert_entity('Location', nm='toto')
            store.commit()  # commit modification to the database before flush
            store.flush()
        self.assertEqual(next(counter), 1)

    def test_slave_mode_indexes(self):
        with self.admin_access.repo_cnx() as cnx:
            slave_store = MassiveObjectStore(cnx, slave_mode=True)
        with self.admin_access.repo_cnx() as cnx:
            crs = cnx.system_sql('SELECT indexname FROM pg_indexes')
            indexes = [r[0] for r in crs.fetchall()]
        self.assertIn('entities_pkey', indexes)
        self.assertIn('entities_extid_idx', indexes)
        self.assertIn('owned_by_relation_p_key', indexes)
        self.assertIn('owned_by_relation_to_idx', indexes)

    def test_slave_mode_exception(self):
        with self.admin_access.repo_cnx() as cnx:
            master_store = MassiveObjectStore(cnx, slave_mode=False)
            slave_store = MassiveObjectStore(cnx, slave_mode=True)
            self.assertRaises(RuntimeError, slave_store.finish)

    def test_simple_insert(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            self.push_geonames_data(self.datapath('geonames.csv'), store)
            store.flush()
            store.commit()
            store.finish()
        with self.admin_access.repo_cnx() as cnx:
            rset = cnx.execute('Any X WHERE X is Location')
            self.assertEqual(len(rset), 4000)
            rset = cnx.execute('Any X WHERE X is Location, X timezone T')
            self.assertEqual(len(rset), 4000)

    def test_index_building(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            self.push_geonames_data(self.datapath('geonames.csv'), store)
            store.flush()

            # Check that indexes are dropped during the import
            crs = cnx.system_sql('SELECT indexname FROM pg_indexes')
            indexes = [r[0] for r in crs.fetchall()]
            self.assertNotIn('entities_pkey', indexes)
            self.assertNotIn('entities_extid_idx', indexes)
            self.assertNotIn('owned_by_relation_p_key', indexes)
            self.assertNotIn('owned_by_relation_to_idx', indexes)

            # Cleanup should restore the indexes
            store.finish()

            # Check indexes again
            crs = cnx.system_sql('SELECT indexname FROM pg_indexes')
            indexes = [r[0] for r in crs.fetchall()]
            self.assertIn('entities_pkey', indexes)
            self.assertIn('entities_extid_idx', indexes)
            self.assertIn('owned_by_relation_p_key', indexes)
            self.assertIn('owned_by_relation_to_idx', indexes)

    def test_multiple_insert(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            store.prepare_insert_entity('Location', name=u'toto')
            store.finish()
            store = MassiveObjectStore(cnx)
            store.prepare_insert_entity('Location', name=u'toto')
            store.finish()

    def test_multiple_insert_relation(self):
        with self.admin_access.repo_cnx() as cnx:
            store = MassiveObjectStore(cnx)
            store.init_rtype_table('Country', 'used_language', 'Language')
            store.finish()
            store = MassiveObjectStore(cnx)
            store.init_rtype_table('Country', 'used_language', 'Language')
            store.finish()


if __name__ == '__main__':
    from logilab.common.testlib import unittest_main
    unittest_main()