Built with code created by the Los Angeles Times Data Desk for its Checkbook LA Watchdog project, this is a "periodically updated archive of data" published by the city of Los Angeles on its open data portal.
If a given csv file and its corresponding json file are under 10,000 rows, the files will be linked to a location in this repo. Otherwise they will be designated n/a.
I'm storing files larger than 10,000 rows locally; as far as I can tell, they are too large for GitHub and too large to handle efficiently in a local repository.
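The 10,000-row cutoff could be checked with a short helper like the sketch below. The `should_link_in_repo` name and `ROW_LIMIT` constant are my own illustrations, not part of watchdog.py:

```python
import csv

ROW_LIMIT = 10000  # hypothetical constant matching the cutoff described above

def should_link_in_repo(csv_path):
    """Return True if the file is small enough to commit and link in the repo."""
    with open(csv_path, newline="") as f:
        row_count = sum(1 for _ in csv.reader(f))
    return row_count < ROW_LIMIT
```

Files that fail the check would get the n/a designation and stay on local disk.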
csv/Catalog.csv | 34350 +++++++++++-------
json/Catalog.json | 35302 +++++++++++--------
json/Messages.json | 12 +-
json/Templates.json | 12 +-
...nthly-residential-energy-usage-by-zip-code.json | 4 +-
json/bavn-open-bid-opportunities.json | 319 +-
json/water-and-electric-rate-zones.json | 4 +-
...k-source-centers-location-and-contact-info.json | 6 +-
watchdog.py | 11 +-
9 files changed, 42258 insertions(+), 27762 deletions(-)
- Try it out and report bugs.
- Figure out ways to build notifications, visualizations or another application on top of the shifting data.
- Try forking it and pointing it at a Socrata-based data site in your city.
- Or just modify it to work off any old public data set.
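One way to start on the notification idea would be to diff two snapshots of the catalog and report which datasets appeared or vanished. This is a minimal sketch under assumptions; the `diff_catalog` helper and its `id` key are illustrative, not part of watchdog.py:

```python
def diff_catalog(old_rows, new_rows, key="id"):
    """Compare two catalog snapshots (lists of dicts) and report
    which dataset ids were added or removed between runs."""
    old_ids = {row[key] for row in old_rows}
    new_ids = {row[key] for row in new_rows}
    return {
        "added": sorted(new_ids - old_ids),
        "removed": sorted(old_ids - new_ids),
    }
```

A result with non-empty `added` or `removed` lists could then be emailed, tweeted, or fed to a visualization.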
Create a virtual environment to work inside.
$ virtualenv --no-site-packages opendata-la-watchdog
Jump in and turn it on.
$ cd opendata-la-watchdog
$ . bin/activate
Clone the git repository from GitHub.
$ git clone [email protected]:SCPR/opendata-la-watchdog.git repo
Enter the project and install its dependencies.
$ cd repo
$ pip install -r requirements.txt
Run the script to get the latest files.
$ python watchdog.py
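Since the archive only stays current if the script runs regularly, you might schedule it with cron. This is a hypothetical crontab entry, not part of the repo; the `/path/to` prefix is a placeholder for wherever you created the environment:

```shell
# Run the watchdog daily at 6 a.m., using the virtualenv's interpreter
# (../bin/python, relative to the repo directory created above).
0 6 * * * cd /path/to/opendata-la-watchdog/repo && ../bin/python watchdog.py
```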