[repo] optimize massive insertion/deletion by using the new set_operation function
The idea is that on massive insertion, the cost of handling the list of
operations becomes non-negligible, so we should minimize the number of
operations in that list.
The set_operation function eases associating an operation with data stored
in session.transaction_data: the operation is added only when the data set
isn't initialized yet, otherwise the data is simply added to the existing
set. The operation then processes the accumulated data in a single pass.
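
A minimal sketch of the pattern described above (not the exact
implementation; the signature and the use of a plain set are assumptions
based on this description):

    def set_operation(session, datakey, value, opcls, **opkwargs):
        """Accumulate `value` under `datakey` in session.transaction_data,
        instantiating `opcls` only the first time the key is seen in the
        current transaction.
        """
        try:
            # usual case on massive insertion: the set already exists, so
            # no new operation is appended to the transaction's list
            session.transaction_data[datakey].add(value)
        except KeyError:
            # first value for this key: register the operation once...
            opcls(session, **opkwargs)
            # ...then initialize the set with this first value
            session.transaction_data[datakey] = set((value,))

A hook would then call, for instance, set_operation(session,
'pendingeids', eid, DeleteOp) for each affected entity ('pendingeids' and
DeleteOp are illustrative names); the single registered operation pops
the accumulated set from session.transaction_data in its precommit_event
and handles all values at once, instead of running one operation per
entity.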
include README
include pylintrc
include bin/cubicweb-*
include man/cubicweb-ctl.1
recursive-include doc *.txt *.zargo *.png *.html makefile *.rst
recursive-include misc *
recursive-include web/data *
recursive-include web/wdoc *.rst *.png *.xml ChangeLog*
include web/views/*.pt
recursive-include etwist *.xml *.html
recursive-include i18n *.pot *.po
recursive-include schemas *.py *.sql.*
recursive-include entities/test/data *
recursive-include sobjects/test/data *
recursive-include server/test/data *
recursive-include server/test sources*
recursive-include web/test/data *.js *.css *.png *.gif *.jpg *.ico external_resources
recursive-include devtools/test/data *
recursive-include skeleton *.py *.css *.js *.po compat *.in *.tmpl
prune misc/cwfs