[dataimport] backout 6947201033be (related to #2788402)
(and add a try/except to catch the intended error)
The problem actually comes from the ``MassiveObjectStore`` in the
``dataio`` cube, so it should be corrected there. Here we only
guard the column access and emit a ``RuntimeWarning`` so that the
user can see the problem.
``value`` is set to ``None`` (hence ``NULL`` from a database
standpoint), so that the data can nevertheless be inserted into the
database. Only the keys actually present in ``row`` end up with
non-``NULL`` values.
The real solution is to fix the issue in ``MassiveObjectStore``
directly; the current try/except is only a temporary hack.
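For reference, here is a minimal standalone sketch of the behaviour
this hack implements (the ``format_row`` helper is illustrative only,
it is not part of the patch)::

    import warnings

    def format_row(row, columns):
        """Format one row, mapping missing columns to 'NULL'."""
        formatted = []
        for col in columns:
            try:
                value = row[col]
            except KeyError:
                # Missing key: warn and fall back to None, which the
                # formatting step below turns into 'NULL'.
                warnings.warn(u"Column %s is not accessible in row %s"
                              % (col, row), RuntimeWarning)
                value = None
            formatted.append('NULL' if value is None else value)
        return formatted

    # 'age' is missing from the row: a RuntimeWarning is emitted and
    # the value is inserted as NULL.
    print(format_row({'name': u'alice'}, ['name', 'age']))

Note that only ``KeyError`` is caught: other exceptions (e.g.
``IndexError`` for list rows) still propagate as before.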
--- a/dataimport.py Fri Apr 05 14:44:03 2013 +0200
+++ b/dataimport.py Thu Apr 04 11:58:41 2013 +0200
@@ -70,6 +70,7 @@
 import sys
 import threading
 import traceback
+import warnings
 import cPickle
 import os.path as osp
 import inspect
@@ -431,14 +432,16 @@
         # If an error is raised, do not continue.
         formatted_row = []
         for col in columns:
-            if isinstance(row, dict):
-                value = row.get(col)
-            elif isinstance(row, (tuple, list)):
+            try:
                 value = row[col]
-            else:
-                raise ValueError("Input data of improper type: %s; "
-                                 "expected tuple, list or dict."
-                                 % type(row).__name__)
+            except KeyError:
+                warnings.warn(u"Column %s is not accessible in row %s"
+                              % (col, row), RuntimeWarning)
+                # XXX temporary hack: set 'value' to None so that the
+                # import does not abort. The missing column simply
+                # becomes NULL on the database side; the real fix
+                # belongs in MassiveObjectStore (dataio cube).
+                value = None
             if value is None:
                 value = 'NULL'
             elif isinstance(value, (long, int, float)):