Channel: Forums - Python

insertion of concatenated records

Hello :)

I'm currently trying to load, as simply as possible, records of the form record = 'value1, value2, ... ,value17' into a layer.

So far I export my data as CSV files and import them using arcpy's MakeXYEventLayer_management and FeatureClassToFeatureClass_conversion (then append them to an existing layer).

Code:

arcpy.MakeXYEventLayer_management(my_csv, "longitude", "latitude", xy_layer, my_sr)
arcpy.FeatureClassToFeatureClass_conversion(xy_layer, my_gdb, "new_fclass", fieldmappings)
arcpy.Append_management(my_gdb+"\\new_fclass", otherTable, "TEST", "", "")

I'm now trying to improve the loading speed and get rid of the CSV file. This should do the trick:

Code:

fieldmappings = arcpy.FieldMappings()
fieldmappings.addTable(otherTable)

cursor = arcpy.InsertCursor(otherTable)
pnt = arcpy.Point()

for row in data:  # rows from a cx_Oracle cursor
    feat = cursor.newRow()
    vals = row.split(",")
    # spatial reference change?
    pnt.X = float(vals[9])  # longitude
    pnt.Y = float(vals[8])  # latitude
    feat.shape = pnt

    for i in range(fieldmappings.fieldCount):
        # a FieldMap has no .name; the target field's name is on outputField
        feat.setValue(fieldmappings.getFieldMap(i).outputField.name, vals[i])

    cursor.insertRow(feat)

del cursor  # release the schema lock
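One fragile spot, independent of arcpy: a plain str.split(",") tears a value apart as soon as it contains an embedded comma. If the concatenated records can carry quoted CSV values, the stdlib csv module parses them correctly (a small pure-Python sketch; the record string is a made-up sample):

```python
import csv

# A sample concatenated record with a quoted, comma-containing value
record = 'value1,"value2, with comma",value3'

naive = record.split(",")          # tears the quoted value into two pieces
vals = next(csv.reader([record]))  # csv.reader accepts any iterable of lines

print(naive)  # ['value1', '"value2', ' with comma"', 'value3']
print(vals)   # ['value1', 'value2, with comma', 'value3']
```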


Is there a cleaner way to do this?
Would I get better results by inserting my records into a temporary ValueTable, or by using an Editor session? (And where would the fieldmappings go in that case?)

Would ArcSDESQLExecute or an "in_memory" temp table be a faster solution?
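For speed, one more option I'm looking at: the arcpy.da.InsertCursor (ArcGIS 10.1+) is considerably faster than the classic InsertCursor, and its "SHAPE@XY" token builds the point geometry from an (x, y) tuple, so no Point object is needed. A sketch under that assumption, with illustrative field names:

```python
import arcpy

# "SHAPE@XY" makes the cursor build point geometry from an (x, y) tuple;
# "field1"/"field2" are placeholders for the real target field names.
fields = ["SHAPE@XY", "field1", "field2"]

with arcpy.da.InsertCursor(otherTable, fields) as cursor:
    for row in data:  # rows from the cx_Oracle cursor
        vals = row.split(",")
        xy = (float(vals[9]), float(vals[8]))  # longitude, latitude
        cursor.insertRow((xy, vals[0], vals[1]))
```

Staging into an "in_memory" feature class and appending once at the end could presumably be combined with this as well.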
