Using ArcGIS 10.0 SP3
I am having some problems using field mapping and Python. If I run the code without the field mapping parameter it executes fine, just exporting all the fields in the source table. The problem seems to be that the field type is changing. Here is my code.
Code:
import os
import arcpy

def getFieldMappings(fc_in, dico):
    field_mappings = arcpy.FieldMappings()
    for input, output in dico.iteritems():
        field_map = arcpy.FieldMap()
        field_map.addInputField(fc_in, input)
        field = field_map.outputField
        field.name = output
        field_map.outputField = field
        field_mappings.addFieldMap(field_map)
        del field, field_map
    return field_mappings

block_id_field = "BLOCKOID"
harv_field_list = [block_id_field, "STARTHARVESTDATE", "SKIDCLEARDATE", "HAULCLEARDATE", "FINALCLEARDATE"]
harv_dico = {
    harv_field_list[0]: "BLOCK_ID",
    harv_field_list[1]: "START_HARV",
    harv_field_list[2]: "SKID",
    harv_field_list[3]: "HAUL",
    harv_field_list[4]: "FINAL"}

table = r"Database Connections/TFM.sde/TFM.V_TFM_HARVEST"
table_out = r"in_memory\temp_table"

if arcpy.Exists(table_out):
    arcpy.Delete_management(table_out)

mapped = getFieldMappings(table, harv_dico)
arcpy.TableToTable_conversion(table, os.path.dirname(table_out), os.path.basename(table_out), "", mapped)
Here is the error I get.

Code:
Traceback (most recent call last):
File "<string>", line 254, in run_nodebug
File "<module1>", line 39, in <module>
File "C:\Program Files (x86)\ArcGIS\Desktop10.0\arcpy\arcpy\conversion.py", line 1463, in TableToTable
raise e
arcgisscripting.ExecuteError: ERROR 000278: Failed on input OID -1, could not write value '87870' to output field BLOCK_ID
Failed to execute (TableToTable).
Here is a dump of the mapped variable that is created.

Code:
HAUL "HAULCLEARDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,HAULCLEARDATE,-1,-1;
START_HARV "STARTHARVESTDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,STARTHARVESTDATE,-1,-1;
BLOCK_ID "BLOCKOID" true true false 4 Short 0 9 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,BLOCKOID,-1,-1;
FINAL "FINALCLEARDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,FINALCLEARDATE,-1,-1;
SKID "SKIDCLEARDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,SKIDCLEARDATE,-1,-1
Using ArcMap, here is a dump of the same tool run successfully, with the fields in the output as expected.

Code:
BLOCKOID "BLOCKOID" true true false 4 Long 0 9 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,BLOCKOID,-1,-1;
STARTHARVESTDATE "STARTHARVESTDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,STARTHARVESTDATE,-1,-1;
FINALCLEARDATE "FINALCLEARDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,FINALCLEARDATE,-1,-1;
HAULCLEARDATE "HAULCLEARDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,HAULCLEARDATE,-1,-1;
SKIDCLEARDATE "SKIDCLEARDATE" true true false 36 Date 0 0 ,First,#,Database Connections/TFM.sde/TFM.V_TFM_HARVEST,SKIDCLEARDATE,-1,-1
The problem appears to be the BLOCK_ID line in the first dump: for some reason my field mapping function changes the data type for BLOCKOID from long to short, so the export bombs out as soon as it hits a value above the short-integer limit of 32,767 ('87870').

I'm assuming the output field type is derived from a sample of a portion of the rows rather than read from the input source field type. Has anyone observed this behaviour, or does anyone know of a workaround? I can't seem to find a way to set the output field type explicitly, which would also solve the issue.
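For completeness, this is the sort of thing I was hoping would force the output type. The force_long parameter is just my own addition for illustration, and I am assuming Field.type can be assigned the same way Field.name can ("Integer" being the Field-object name for a long integer); I have not been able to confirm that TableToTable honours it.

Code:
# Sketch only: same function as above, but with a hypothetical force_long
# list that tries to pin the output field type to long integer.
def getFieldMappings(fc_in, dico, force_long=None):
    force_long = force_long or []
    field_mappings = arcpy.FieldMappings()
    for input, output in dico.iteritems():
        field_map = arcpy.FieldMap()
        field_map.addInputField(fc_in, input)
        field = field_map.outputField
        field.name = output
        if input in force_long:
            # Assumption: Field.type is writable like Field.name;
            # untested whether TableToTable respects the change.
            field.type = "Integer"
        field_map.outputField = field
        field_mappings.addFieldMap(field_map)
    return field_mappings

mapped = getFieldMappings(table, harv_dico, force_long=["BLOCKOID"])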