The optimize-waterallocation.jl script optimizes the joint extraction of surface and groundwater withdrawals and reservoir storage to satisfy given water demands. It produces four files:

- withdrawals.jld: The amount withdrawn from each gauge along each canal.
- returns.jld: The amount returned to each gauge along each canal.
- captures.jld: The amount stored in each reservoir from one period to the next.
- waterfromgw.jld: The amount of water pumped from groundwater for each county.

How does the optimal division between ground and surface water differ from the recorded values?
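These .jld outputs are read back with Julia's built-in serializer, as the cells below show. A minimal round-trip sketch, using toy data in place of the actual model output:

```julia
using Serialization  # needed on Julia >= 0.7; in Base on older versions

# Toy stand-in for a model output, e.g. 5 gauges x 12 periods
toy = rand(5, 12)

# Write and read back with serialize/deserialize, as the notebook does
path = tempname()
open(io -> serialize(io, toy), path, "w")
loaded = open(deserialize, path)

@assert loaded == toy
```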
First, let's take the recorded values.
In [2]:
using DataFrames
df = readtable("../data/extraction/USGS-2010.csv")[:, [:FIPS, :TO_SW, :TO_GW, :TO_To]]
Out[2]:
In [3]:
surfaces = deserialize(open("../data/cache/counties/extraction/withdrawals.jld"));
grounds = deserialize(open("../data/cache/counties/extraction/waterfromgw.jld"))
Out[3]:
Sum these to the county-year level.
In [12]:
drawsdata = read_rda("../data/countydraws.RData", convertdataframes=true);
draws = drawsdata["draws"]
optsw = []
optgw = []
for ii in 1:size(grounds, 1)
push!(optsw, sum(surfaces[draws[:fips] .== df[ii, :FIPS], :]))
push!(optgw, sum(grounds[ii, :]))
end
# Convert the optimization totals to the units of the recorded USGS values
df[:optsw] = optsw / (1382592. / 1000.)
df[:optgw] = optgw / (1382592. / 1000.)
df
Out[12]:
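The aggregation loop above can be sketched with toy data (hypothetical values, not the actual model outputs): each row of `surfaces` is one canal draw, and `draws_fips` maps each draw to its county FIPS code.

```julia
# 3 draws x 2 periods of surface withdrawals (toy values)
surfaces = [1.0 2.0; 3.0 4.0; 5.0 6.0]
draws_fips = [1001, 1003, 1001]  # county FIPS for each draw
fips_list = [1001, 1003]          # counties, in the order of the demand data

optsw = Float64[]
for fips in fips_list
    # Sum every draw (row) belonging to this county, over all periods
    push!(optsw, sum(surfaces[draws_fips .== fips, :]))
end

@assert optsw == [14.0, 7.0]  # rows 1 and 3 for 1001; row 2 for 1003
```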
In [13]:
include("../src/lib/graphing.jl")
Out[13]:
Where do we have under-allocations? That is, where do the optimized values fall short of the recorded levels?
In [15]:
df[:fips] = df[:FIPS]
df[:value] = convert(DataVector{Float64}, df[:TO_To] - df[:optsw] - df[:optgw])
usmap(df, true)
Out[15]:
Where do we use more or less groundwater than recorded (since recorded groundwater use is the current source of prices)?
In [18]:
df[:value] = convert(DataVector{Float64}, (df[:TO_GW] - df[:optgw]))
usmap(df, true)
Out[18]: