improve massive deletion performance
change the hooks.integrity._DelayedDeleteOp implementation so that it processes
entities in chunks of reasonable size (500 entities at a time)
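The chunking idea itself is straightforward; here is a minimal, generic sketch only (CHUNK_SIZE, iter_chunks and the delete_many callback are illustrative names, not the actual _DelayedDeleteOp code):

CHUNK_SIZE = 500

def iter_chunks(items, size=CHUNK_SIZE):
    # yield successive slices of at most `size` items
    items = list(items)
    for start in range(0, len(items), size):
        yield items[start:start + size]

def delete_in_chunks(eids, delete_many):
    # delete_many stands for whatever actually deletes a batch of eids;
    # handling 500 eids per call keeps each query to a manageable size
    for chunk in iter_chunks(eids):
        delete_many(chunk)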
adapt ssplanner.DeleteEntitiesStep to call a variant of glob_delete_entity taking several entities.
That variant calls all the before_delete_entities hooks in one go, then
performs the deletion, then calls all the after_delete_entities hooks. The
deletion is performed by grouping entities by etype and by source.
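A rough sketch of what such a multi-entity variant can look like (call_hooks, delete_from_source and the etype/source attributes are placeholder names, not necessarily the real CubicWeb API):

from collections import defaultdict

def glob_delete_entities(session, entities, call_hooks, delete_from_source):
    # call_hooks(event, entities) and delete_from_source(source, etype, group)
    # stand in for the real hook and source deletion APIs
    entities = list(entities)
    # all the "before" hooks are fired in one go, on the whole list
    call_hooks('before_delete_entities', entities)
    # entities are grouped by (etype, source) so that each source can
    # delete a whole batch of same-typed entities at once
    groups = defaultdict(list)
    for entity in entities:
        groups[(entity.etype, entity.source)].append(entity)
    for (etype, source), group in groups.items():
        delete_from_source(source, etype, group)
    # all the "after" hooks are fired once the deletion is done
    call_hooks('after_delete_entities', entities)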
adapt the HooksManager to call the hooks on a list of entities instead of on a single entity.
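To illustrate the interface change, a toy manager whose hooks receive a list of entities might look like this (not the real HooksManager):

class ListHooksManager(object):
    # toy illustration: callbacks registered for an event are called once
    # with the whole list of entities instead of once per entity
    def __init__(self):
        self._hooks = {}

    def register(self, event, callback):
        self._hooks.setdefault(event, []).append(callback)

    def call_hooks(self, event, session, entities):
        for callback in self._hooks.get(event, ()):
            callback(session, entities)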
adapt the sources to be able to delete several entities of the same etype at once.
change the source fti_(un)index_entity methods to fti_(un)index_entities, taking a collection of entities.
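As an example of the new signatures, a toy source could expose collection-based indexing methods along these lines (a sketch under assumed names, not the actual source code):

class ToySource(object):
    # records which eids are full-text indexed; the real methods would
    # talk to the actual indexer instead
    def __init__(self):
        self.indexed = set()

    def fti_index_entities(self, session, entities):
        # one call now handles a whole collection of entities
        for entity in entities:
            self.indexed.add(entity.eid)

    def fti_unindex_entities(self, session, entities):
        for entity in entities:
            self.indexed.discard(entity.eid)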
#!/usr/bin/python
"""usage: fix-po-encodings [filename...]

change the encoding of the po files passed as arguments to utf-8
"""
import sys
import re
import codecs

def change_encoding(filename, target='UTF-8'):
    """rewrite `filename` in place, re-encoded to `target`"""
    fdesc = open(filename)
    data = fdesc.read()
    fdesc.close()
    encoding = find_encoding(data)
    if encoding == target:
        return
    data = fix_encoding(data, target)
    data = unicode(data, encoding)
    fdesc = codecs.open(filename, 'wb', encoding=target)
    fdesc.write(data)
    fdesc.close()

def find_encoding(data):
    """return the charset declared in the po file's Content-Type header"""
    regexp = re.compile(r'"Content-Type:.* charset=([a-zA-Z0-9-]+)\\n"', re.M)
    mo = regexp.search(data)
    if mo is None:
        raise ValueError('No encoding declaration')
    return mo.group(1)

def fix_encoding(data, target_encoding):
    """replace the declared charset with `target_encoding`"""
    regexp = re.compile(r'("Content-Type:.* charset=)(.*)(\\n")', re.M)
    return regexp.sub(r'\1%s\3' % target_encoding, data)

for filename in sys.argv[1:]:
    print filename
    change_encoding(filename)