Channel: Forums - Python

controlling cell size of output in raster operation

Another updating-from-9.2-to-10.1 script issue. I have a Python script I'm updating in which a raster calculator sum was originally made via a string expression, and I want to convert it. The cell size of the input rasters varies, and (for reasons I won't go into) I want the output to preserve the finest resolution among them; that value is stored in a variable named pOrigCellSize (see below). I can't seem to make this happen, however: I set arcpy.env.cellSize to the value I want for the output, but to no avail. Here is the code snippet:

# we are in a loop that runs pLevel from 8 down to 1
arcpy.env.CellSize = pOrigCellSize

if pLevel == 8:
    seq_ras = Raster(gRM)
else:
    seq_ras = seq_ras + Raster(gRM)

The resulting seq_ras raster comes out at the lowest resolution (coarsest cell size) of the input rasters, not at a cell size of pOrigCellSize, which is what I want (and yes, I checked that variable in the stack during a debug session). What am I doing wrong?
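A likely culprit, for anyone hitting the same wall: arcpy environment settings are exposed as case-sensitive attributes, and the documented property is arcpy.env.cellSize (lowercase c). The snippet above assigns to arcpy.env.CellSize, which (depending on the arcpy version) is either rejected or simply stored as an unrelated attribute, so the geoprocessing environment is never changed and the sum falls back to the default "MAXOF" (coarsest input) cell size. A minimal plain-Python sketch of the pitfall, using a hypothetical FakeEnv stand-in rather than arcpy itself:

```python
class FakeEnv:
    """Hypothetical stand-in for arcpy.env, for illustration only."""
    def __init__(self):
        # "MAXOF" is arcpy's default cell size: the coarsest of the inputs.
        self.cellSize = "MAXOF"

env = FakeEnv()

# Assigning with the wrong capitalization creates a brand-new attribute;
# the real setting that raster operations would read is untouched.
env.CellSize = 25
print(env.cellSize)   # still "MAXOF"

# Lowercase c targets the real setting.
env.cellSize = 25
print(env.cellSize)   # 25
```

The same one-character fix in the original snippet (arcpy.env.cellSize = pOrigCellSize) should make the sum honor the finest resolution, assuming pOrigCellSize holds the smallest cell size of the inputs.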

