[repo] optimize massive insertion/deletion by using the new set_operation function
The idea is that on massive insertion, the cost of handling the list of
operations becomes non-negligible, so we should minimize the number of
operations in that list.
The set_operation function eases the use of an operation associated with data
in session.transaction_data: the operation is registered only when the data
set isn't initialized yet; otherwise the data is simply added to the existing
set. The operation then processes the accumulated data in a single pass.
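
To make the pattern concrete, here is a minimal, self-contained sketch of the
accumulation mechanism described above. The Session, Operation, set_operation
and ProcessEntitiesOp names are simplified stand-ins chosen for illustration,
not the actual CubicWeb implementation or signatures.

class Operation(object):
    """Base class: operations are processed once at commit time."""
    def __init__(self, session):
        self.session = session
        session.pending_operations.append(self)

    def precommit_event(self):
        raise NotImplementedError


def set_operation(session, datakey, value, opcls):
    """Accumulate `value` under `datakey`; instantiate `opcls` only on
    the first call, so a massive insertion registers a single operation."""
    try:
        # fast path: the set already exists, just accumulate the value
        session.transaction_data[datakey].add(value)
    except KeyError:
        # first call for this key: create the set *and* the operation
        session.transaction_data[datakey] = set((value,))
        opcls(session)


class ProcessEntitiesOp(Operation):
    """Process every accumulated eid in one pass at commit time."""
    def precommit_event(self):
        for eid in self.session.transaction_data.pop('neweids', ()):
            print('processing', eid)


class Session(object):
    def __init__(self):
        self.transaction_data = {}
        self.pending_operations = []

    def commit(self):
        for op in self.pending_operations:
            op.precommit_event()
        self.pending_operations = []


if __name__ == '__main__':
    session = Session()
    for eid in range(5):
        # five insertions, but only one operation ends up registered
        set_operation(session, 'neweids', eid, ProcessEntitiesOp)
    assert len(session.pending_operations) == 1
    session.commit()

The key point is the try/except in set_operation: the KeyError branch runs
exactly once per transaction and data key, which keeps the pending-operations
list short regardless of how many entities are inserted.
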
"""
:organization: Logilab
:copyright: 2001-2010 LOGILAB S.A. (Paris, FRANCE), license is LGPL v2.
:contact: http://www.logilab.fr/ -- mailto:contact@logilab.fr
:license: GNU Lesser General Public License, v2.1 - http://www.gnu.org/licenses
"""
from logilab.common.testlib import TestCase, unittest_main

from cubicweb.devtools.fake import FakeRequest


class AjaxReplaceUrlTC(TestCase):

    def test_ajax_replace_url(self):
        req = FakeRequest()
        arurl = req.build_ajax_replace_url
        # NOTE: for the simplest use cases, we could use doctest
        self.assertEquals(arurl('foo', 'Person P', 'list'),
                          "javascript: loadxhtml('foo', 'http://testing.fr/cubicweb/view?rql=Person%20P&__notemplate=1&vid=list', 'replace')")
        self.assertEquals(arurl('foo', 'Person P', 'oneline', name='bar', age=12),
                          '''javascript: loadxhtml('foo', 'http://testing.fr/cubicweb/view?age=12&rql=Person%20P&__notemplate=1&vid=oneline&name=bar', 'replace')''')


if __name__ == '__main__':
    unittest_main()