In [1]:
import seaborn as sns
import metapack as mp
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from IPython.display import display 

%matplotlib inline
sns.set_context('notebook')
mp.jupyter.init()

In [1]:
parking_pkg = mp.open_package('http://library.metatab.org/sandiego.gov-cityiq_parking-2.csv')
parking_pkg


Out[1]:

San Diego Parking Time Series

sandiego.gov-cityiq_parking-2 Last Update: 2019-02-16T05:50:44

15-minute interval parking utilization for 1600 parking zones in the City of San Diego.

This dataset is compiled from parking events scraped from the San Diego CityIQ smart streetlight system, via the cityiq Python package. It is built from PKIN and PKOUT events between Sept 2018 and Feb 2019 for the whole San Diego system.

The dataset is heavily processed to eliminate duplicate events, because the raw feed contains many spurious events and, in particular, an excess of PKIN events. When computing the number of cars parked in all parking zones, the excess of PKIN events results in about 60,000 extra cars per month. These issues are explored in a Jupyter Notebook.

The records in this dataset reference parking zones. More information, including geographic positions, is available in the CityIQ Objects dataset.

Processing

These data were produced with these programs:

$ pip install cityiq
$ ciq_config -w
$ # Edit .cityiq-config.yaml with client-id and secret
$ # Scrape PKIN and PKOUT from Sept 2018 to present
$ ciq_events -s -e PKIN -e PKOUT -t 20190901
$ # Split event dump into per-location CSV files
$ ciq_events -S
$ # Deduplicate and normalize
$ ciq_events -n

The last step, deduplication and normalization, involves these steps:

  • Group events by event type, location and 1-second period and select only one record from each group
  • Collect runs of events of one type and keep only the first record of each run, for runs up to 4 minutes long (see the sketch below)
  • For each location, compute the cumulative sum of ins and outs (the number of cars in the zone), then compute a rolling 2-day average and subtract it off
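
The first two steps can be sketched roughly in pandas. This is an illustration only, not the package's actual code, and it assumes an events dataframe with locationuid, eventtype and timestamp columns (hypothetical names; the real event dump may use different ones):

# Step 1: keep one record per event type, location and 1-second period
events = events.sort_values('timestamp')
events['second'] = events['timestamp'].dt.floor('1s')
events = events.drop_duplicates(subset=['locationuid', 'eventtype', 'second'])

# Step 2: within each location, keep only the first event of each run of
# identical event types (the 4-minute run limit is not handled in this sketch)
def first_of_run(g):
    return g[g['eventtype'] != g['eventtype'].shift()]

events = events.groupby('locationuid', group_keys=False).apply(first_of_run)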

The third step is demonstrated in this image:

The blue line is the original utilization for a single location, showing the larger number of PKIN events than PKOUT events. The red line is the 2-day rolling average, and the green line is the result after subtracting the 2-day rolling average.

In the final dataset, the data for the blue line is in the cs column, which is created from the cumulative sum of the delta column. The green line is the data in the cs_norm column, which is differentiated to create the delta_norm column.

For most purposes you should use cs_norm and delta_norm.
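
In pandas terms, the relationship between these columns looks roughly like this for a single location, assuming delta is a per-interval series with a datetime index (a sketch, not the package's exact processing code):

cs = delta.cumsum()              # cars in the zone (blue line, 'cs')
trend = cs.rolling('2D').mean()  # 2-day rolling average (red line)
cs_norm = cs - trend             # normalized count (green line, 'cs_norm')
delta_norm = cs_norm.diff()      # differentiated back to per-interval changes ('delta_norm')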

Contacts

References


In [13]:
meta_pkg = mp.open_package('http://library.metatab.org/sandiego.gov-cityiq_objects-2.csv')
meta_pkg


Out[13]:

San Diego City IQ Assets and Locations

sandiego.gov-cityiq_objects-2 Last Update: 2019-02-16T06:29:45

All asset types extracted from the San Diego CityIQ system

These data files are extracts of the assets and locations from the San Diego CityIQ system. Refer to the CityIQ developer documentation for details about these data records. The data are extracted using the cityiq Python package. See the ExtractAssets.ipynb notebook for the extract process.

Contacts

Resources

  • assets. All assets
  • locations. All locations: walkways, traffic lanes and parking zones

References


In [14]:
locations = meta_pkg.resource('locations').dataframe()
pzones = locations[locations.locationtype == 'PARKING_ZONE']

In [15]:
pzones.head()


Out[15]:
locationuid locationtype parentlocationuid geometry
93 03e23a6ace01fc14a86ab250ce65b682 PARKING_ZONE 03e23a6ace01fc14a86ab250ce65b682 40053.0
115 051zxsk4ofy6jl2daq51 PARKING_ZONE 051zxsk4ofy6jl2daq51 2027.0
131 05yh4xfuhdvbjl5dc16y PARKING_ZONE 05yh4xfuhdvbjl5dc16y 528.0
139 06c474fb976e2f394b69309d41d19f7f PARKING_ZONE 06c474fb976e2f394b69309d41d19f7f 9022.0
165 07c91510cc45f849766b1d3bbab46e90 PARKING_ZONE 07c91510cc45f849766b1d3bbab46e90 838.0
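
The parking time series references these zones, so the two packages can be joined on the zone identifier. A minimal sketch, assuming the parking package exposes a resource named 'parking' (the actual resource name is not shown above) whose dataframe includes a locationuid column:

In [ ]:
parking = parking_pkg.resource('parking').dataframe()
parking = parking.merge(pzones[['locationuid', 'geometry']], on='locationuid', how='left')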

In [ ]: