Log outage geometries to database #68
7 changed files with 21548 additions and 83 deletions
@@ -2,7 +2,7 @@ when:
   branch: main
 steps:
   - name: lint
-    image: python:3-alpine
+    image: python:3-slim
     commands:
       - python -m pip install --upgrade pip
      - python -m pip install -r requirements.txt
fonts/LICENSE.md (new file, +115)
@@ -0,0 +1,115 @@
This repository is a fork of Libre Franklin: https://github.com/impallari/Libre-Franklin

## License for USWDS’s Modified Version

This repository contains both the original font software for Libre Franklin (the “Original Version”) and font software modifications made by the General Services Administration (GSA). This repository combines the Original Version and these GSA modifications into a piece of font software called Public Sans, which is a “Modified Version” of Libre Franklin.

As a work of the United States Government, the font software modifications made by GSA are not subject to copyright within the United States. Additionally, GSA waives copyright and related rights in its font software modifications worldwide through the [CC0 1.0 Universal public domain dedication](https://creativecommons.org/publicdomain/zero/1.0/).

The Original Version (as defined in the SIL Open Font License, Version 1.1) remains subject to copyright under the SIL Open Font License, Version 1.1.

This Modified Version (Public Sans) contains both software under the SIL Open Font License, Version 1.1 and software modifications by GSA released as CC0. As a work of the United States Government, the software modifications made by GSA are not subject to copyright within the United States. Additionally, GSA waives copyright and related rights in its software modifications worldwide through the [CC0 1.0 Universal Public Domain Dedication](https://creativecommons.org/publicdomain/zero/1.0/). It is a “joint work” made of the original software and modifications combined into a single work.

**In practice, users of this Modified Version (Public Sans) should use Public Sans according to the terms of the SIL Open Font License, Version 1.1, below.** This is because this font is a combination of work subject to copyright and work not subject to copyright, so the more restrictive requirements apply to using the combined work.

## License of project USWDS’s Modified Version is based on

- Libre Franklin is licensed under the SIL Open Font License, Version 1.1 (<http://scripts.sil.org/OFL>)
- To view the copyright and specific terms and conditions of Libre Franklin, please refer to [OFL.txt](https://github.com/impallari/Libre-Franklin/blob/master/OFL.txt)

## SIL Open Font License, Version 1.1

Copyright 2015 The Public Sans Project Authors (https://github.com/uswds/public-sans)

This Font Software is licensed under the SIL Open Font License, Version 1.1.

This license is copied below, and is also available with a FAQ at http://scripts.sil.org/OFL

```
-----------------------------------------------------------
SIL OPEN FONT LICENSE Version 1.1 - 26 February 2007
-----------------------------------------------------------

PREAMBLE
The goals of the Open Font License (OFL) are to stimulate worldwide
development of collaborative font projects, to support the font creation
efforts of academic and linguistic communities, and to provide a free and
open framework in which fonts may be shared and improved in partnership
with others.

The OFL allows the licensed fonts to be used, studied, modified and
redistributed freely as long as they are not sold by themselves. The
fonts, including any derivative works, can be bundled, embedded,
redistributed and/or sold with any software provided that any reserved
names are not used by derivative works. The fonts and derivatives,
however, cannot be released under any other type of license. The
requirement for fonts to remain under this license does not apply
to any document created using the fonts or their derivatives.

DEFINITIONS
"Font Software" refers to the set of files released by the Copyright
Holder(s) under this license and clearly marked as such. This may
include source files, build scripts and documentation.

"Reserved Font Name" refers to any names specified as such after the
copyright statement(s).

"Original Version" refers to the collection of Font Software components as
distributed by the Copyright Holder(s).

"Modified Version" refers to any derivative made by adding to, deleting,
or substituting -- in part or in whole -- any of the components of the
Original Version, by changing formats or by porting the Font Software to a
new environment.

"Author" refers to any designer, engineer, programmer, technical
writer or other person who contributed to the Font Software.

PERMISSION & CONDITIONS
Permission is hereby granted, free of charge, to any person obtaining
a copy of the Font Software, to use, study, copy, merge, embed, modify,
redistribute, and sell modified and unmodified copies of the Font
Software, subject to the following conditions:

1) Neither the Font Software nor any of its individual components,
in Original or Modified Versions, may be sold by itself.

2) Original or Modified Versions of the Font Software may be bundled,
redistributed and/or sold with any software, provided that each copy
contains the above copyright notice and this license. These can be
included either as stand-alone text files, human-readable headers or
in the appropriate machine-readable metadata fields within text or
binary files as long as those fields can be easily viewed by the user.

3) No Modified Version of the Font Software may use the Reserved Font
Name(s) unless explicit written permission is granted by the corresponding
Copyright Holder. This restriction only applies to the primary font name as
presented to the users.

4) The name(s) of the Copyright Holder(s) or the Author(s) of the Font
Software shall not be used to promote, endorse or advertise any
Modified Version, except to acknowledge the contribution(s) of the
Copyright Holder(s) and the Author(s) or with their explicit written
permission.

5) The Font Software, modified or unmodified, in part or in whole,
must be distributed entirely under this license, and must not be
distributed under any other license. The requirement for fonts to
remain under this license does not apply to any document created
using the Font Software.

TERMINATION
This license becomes null and void if any of the above conditions are
not met.

DISCLAIMER
THE FONT SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT
OF COPYRIGHT, PATENT, TRADEMARK, OR OTHER RIGHT. IN NO EVENT SHALL THE
COPYRIGHT HOLDER BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
INCLUDING ANY GENERAL, SPECIAL, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL
DAMAGES, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF THE USE OR INABILITY TO USE THE FONT SOFTWARE OR FROM
OTHER DEALINGS IN THE FONT SOFTWARE.
```
fonts/PublicSans-Regular.otf (new binary file)
Binary file not shown.
geospatial.py (new file, +11)
@@ -0,0 +1,11 @@
from shapely import MultiPolygon, Polygon


def convert_outage_geometry(event) -> MultiPolygon:
    assert event["polygons"]["type"] == "polygon"
    assert event["polygons"]["hasZ"] is False
    assert event["polygons"]["hasM"] is False
    polygon_list = []
    for ring in event["polygons"]["rings"]:
        polygon_list.append(Polygon(ring))
    return MultiPolygon(polygon_list)
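For orientation, a minimal sketch of how the new helper is exercised. The event dict below is hypothetical, but it follows the `"polygons"`/`"rings"` shape the SCL feed (and the sample JSON added in this PR) uses, with coordinates in (lon, lat) order:

```python
from geospatial import convert_outage_geometry

# Hypothetical event payload (coordinates invented, roughly downtown Seattle).
event = {
    "polygons": {
        "type": "polygon",
        "hasZ": False,
        "hasM": False,
        "rings": [
            [(-122.35, 47.61), (-122.34, 47.61), (-122.34, 47.62), (-122.35, 47.61)],
        ],
    }
}

outage_geometries = convert_outage_geometry(event)
print(outage_geometries.geom_type)     # "MultiPolygon"
print(outage_geometries.centroid.wkt)  # the centroid later drives reverse geocoding
```

The returned MultiPolygon is what scl.py stores in the database and uses for the centroid-based reverse geocode; the top-level `MultiPolygon`/`Polygon` imports are available in any shapely 2.x release covered by the dependency pins below.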
@@ -1,20 +1,20 @@
 blurhash==1.1.4
-certifi==2024.2.2
+certifi==2024.8.30
-charset-normalizer==3.3.2
+charset-normalizer==3.4.0
 decorator==5.1.1
-greenlet==3.0.3
+greenlet==3.1.1
-idna==3.6
+idna==3.10
-install==1.3.5
+pip-install==1.3.5
 Mastodon.py==1.8.1
-numpy==1.26.4
+numpy==2.1.3
-pillow==10.2.0
+pillow==11.0.0
-python-dateutil==2.8.2
+python-dateutil==2.9.0.post0
 python-magic==0.4.27
-PyYAML==6.0.1
+PyYAML==6.0.2
-requests==2.31.0
+requests==2.32.3
-shapely==2.0.3
+shapely==2.0.6
 six==1.16.0
-SQLAlchemy==2.0.27
+SQLAlchemy==2.0.36
 staticmap==0.5.7
-typing_extensions==4.10.0
+typing_extensions==4.12.2
-urllib3==2.2.1
+urllib3==2.2.3
sample/burien_large_spread_out.json (new file, +21271)
File diff suppressed because it is too large.
scl.py (168 changed lines)
@@ -1,10 +1,10 @@
 import io
-import math
 from datetime import datetime
 from typing import Optional

 import mastodon
 import requests
+import shapely
 import sqlalchemy.types as types
 import staticmap
 import yaml
@@ -15,6 +15,8 @@ from sqlalchemy import create_engine, select
 from sqlalchemy.exc import NoResultFound
 from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column

+from geospatial import convert_outage_geometry
+
 post_datetime_format = "%b %e %l:%M %p"

 scl_events_url = "https://utilisocial.io/datacapable/v2/p/scl/map/events"
@@ -45,10 +47,7 @@ class AttribStaticMap(staticmap.StaticMap, object):
         super(AttribStaticMap, self)._draw_features(image)

         txt = Image.new("RGBA", image.size, (255, 255, 255, 0))
-        # get a font
-        # fnt = ImageFont.truetype('FreeMono.ttf', 12)
-        fnt = ImageFont.load_default()
-        # get a drawing context
+        fnt = ImageFont.truetype("fonts/PublicSans-Regular.otf", 24)
         d = ImageDraw.Draw(txt)

         textSize = fnt.getbbox(self.attribution)
@@ -99,8 +98,22 @@ def classify_event_size(num_people: int) -> dict[str, str | bool]:


 def get_hashtag_string(event) -> str:
-    hashtag_string = "#SeattleCityLightOutage #SCLOutage #SCLOutage{}".format(
-        event["identifier"]
+    city = str()
+    try:
+        city = event["geoloc_city"]
+    except KeyError:
+        city = event["city"]
+
+    neighborhood_text = str()
+    try:
+        neighborhood = event["neighborhood"]
+        if neighborhood != city:
+            neighborhood_text = " #{}".format(neighborhood.title().replace(" ", ""))
+    except KeyError:
+        pass
+
+    hashtag_string = "#SeattleCityLightOutage #SCLOutage{} #{}".format(
+        neighborhood_text, city.title().replace(" ", "")
     )
     return hashtag_string

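The reworked hashtag builder derives tags from the geocoded city and neighborhood instead of the outage identifier. A quick sketch of the expected behaviour (the event dicts are hypothetical, and `get_hashtag_string` from the hunk above is assumed to be in scope):

```python
# "geoloc_city" and "neighborhood" are the keys do_initial_post() fills in after geocoding.
event = {"city": "Seattle", "geoloc_city": "Seattle", "neighborhood": "Ballard"}
print(get_hashtag_string(event))
# -> "#SeattleCityLightOutage #SCLOutage #Ballard #Seattle"

# Without a geocoded neighborhood the tag list falls back to the feed's city alone.
event = {"city": "Lake Forest Park"}
print(get_hashtag_string(event))
# -> "#SeattleCityLightOutage #SCLOutage #LakeForestPark"
```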
@@ -116,7 +129,11 @@ def convert_outage_geometry(event) -> MultiPolygon:


 def do_initial_post(
-    event, event_class, start_time: datetime, estimated_restoration_time: datetime
+    event,
+    event_class,
+    start_time: datetime,
+    estimated_restoration_time: datetime,
+    outage_geometries: shapely.MultiPolygon,
 ) -> dict[str, str | None]:
     post_id = None
     map_media_post_id = None
@@ -124,10 +141,11 @@ def do_initial_post(
     area_text = event["city"]
     try:
         map = AttribStaticMap(
-            512,
-            512,
-            url_template="https://tiles.stadiamaps.com/tiles/outdoors/{z}/{x}/{y}.png?api_key="
+            1024,
+            1024,
+            url_template="https://tiles.stadiamaps.com/tiles/outdoors/{z}/{x}/{y}@2x.png?api_key="
             + stadiamaps_api_key,
+            tile_size=512,
         )
         assert event["polygons"]["type"] == "polygon"
         for ring in event["polygons"]["rings"]:
@@ -139,30 +157,22 @@ def do_initial_post(
                 simplify=True,
             )
             map.add_polygon(polygon)
-        map_image = map.render()

         try:
-
-            def num2deg(xtile, ytile, zoom):
-                n = 1 << zoom
-                lon_deg = xtile / n * 360.0 - 180.0
-                lat_rad = math.atan(math.sinh(math.pi * (1 - 2 * ytile / n)))
-                lat_deg = math.degrees(lat_rad)
-                return lat_deg, lon_deg
-
-            center_lat_lon = num2deg(map.x_center, map.y_center, map.zoom)
+            outage_center: shapely.Point = outage_geometries.centroid
+            assert outage_center.geom_type == "Point"

             # Check to make sure the calculated lat and lon are sane enough
             # NW Corner
-            assert center_lat_lon[0] < 48 and center_lat_lon[1] > -122.6
+            assert outage_center.y < 48 and outage_center.x > -122.6
             # SE Corner
-            assert center_lat_lon[0] > 47.2 and center_lat_lon[1] < -122
+            assert outage_center.y > 47.2 and outage_center.x < -122

             # Zoom level 17 ensures that we won't get any building/POI names, just street names
             geocode_url = "{nominatim_url}/reverse?lat={lat}&lon={lon}&format=geocodejson&zoom=17".format(
                 nominatim_url=nominatim_url,
-                lat=center_lat_lon[0],
-                lon=center_lat_lon[1],
+                lat=outage_center.y,
+                lon=outage_center.x,
             )
             geocode_headers = {"User-Agent": "seattlecitylight-mastodon-bot"}
             geocode_response = requests.get(geocode_url, headers=geocode_headers)
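The tile-math helper (`num2deg`) is replaced by shapely's centroid, so the reverse-geocode point now comes from the outage geometry itself rather than the rendered map's center tile. A minimal sketch of the same calculation in isolation (geometry values invented):

```python
from shapely import MultiPolygon, Polygon

# Invented outage footprint roughly in north Seattle, coordinates in (lon, lat) order.
outage_geometries = MultiPolygon(
    [Polygon([(-122.35, 47.66), (-122.34, 47.66), (-122.34, 47.67)])]
)

outage_center = outage_geometries.centroid
# Same sanity box as the diff: NW corner ~(48, -122.6), SE corner ~(47.2, -122).
assert 47.2 < outage_center.y < 48
assert -122.6 < outage_center.x < -122

lat, lon = outage_center.y, outage_center.x  # values fed to the Nominatim /reverse call
```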
@@ -172,15 +182,15 @@ def do_initial_post(
                 print("JSON could not be loaded from nominatim API")
                 raise

-            if geocode["features"][0]["properties"]["geocoding"]["city"] != "Seattle":
-                city_not_seattle_text = " of {}".format(
-                    geocode["features"][0]["properties"]["geocoding"]["city"]
-                )
+            city = geocode["features"][0]["properties"]["geocoding"]["city"]
+            street = geocode["features"][0]["properties"]["geocoding"]["name"]
+            event["geoloc_city"] = city
+
+            if city != "Seattle":
+                city_not_seattle_text = " of {}".format(city)
             else:
                 city_not_seattle_text = ""

-            street = geocode["features"][0]["properties"]["geocoding"]["name"]
-
             if (
                 "locality" in geocode["features"][0]["properties"]["geocoding"]
                 and event_class["size"] != "Large"
@@ -195,30 +205,32 @@ def do_initial_post(
                     city_not_seattle_text,
                 )
                 area_text = "the {} area{}".format(locality, city_not_seattle_text)
+                event["neighborhood"] = locality
             elif "district" in geocode["features"][0]["properties"]["geocoding"]:
+                district = geocode["features"][0]["properties"]["geocoding"]["district"]
                 alt_text = "A map showing the location of the outage, centered around {} in the {} area{}.".format(
                     street,
-                    geocode["features"][0]["properties"]["geocoding"]["district"],
+                    district,
                     city_not_seattle_text,
                 )
                 area_text = "the {} area{}".format(
-                    geocode["features"][0]["properties"]["geocoding"]["district"],
+                    district,
                     city_not_seattle_text,
                 )
+                event["neighborhood"] = district
             else:
                 alt_text = "A map showing the location of the outage, centered around {} in {}.".format(
                     street,
-                    geocode["features"][0]["properties"]["geocoding"]["city"],
+                    city,
                 )
-                area_text = geocode["features"][0]["properties"]["geocoding"]["city"]
+                area_text = city
         except Exception:
             alt_text = "A map showing the location of the outage."

+        map_image = map.render()
+
         with io.BytesIO() as map_image_file:
-            map_image.save(map_image_file, format="PNG", optimize=True)
-            if __debug__:
-                print("Would have uploaded the map media here")
-            else:
+            map_image.save(map_image_file, format="WebP", method=6)
             map_media_post = mastodon_client.media_post(
                 map_image_file.getvalue(),
                 mime_type="image/png",
@@ -234,17 +246,22 @@ def do_initial_post(
         map_media_post = None
     hashtag_string = get_hashtag_string(event)

+    est_restoration_post_text = str()
+    if estimated_restoration_time > datetime.now():
+        est_restoration_post_text = "\nEst. Restoration: {}\n".format(
+            estimated_restoration_time.strftime(post_datetime_format)
+        )
+
     post_text = """Seattle City Light is reporting a {} outage in {}.

-Start Date: {}
-Est. Restoration: {}
+Start Date: {}{}
 Cause: {}

 {}""".format(
         event_class["size"].lower(),
         area_text,
         start_time.strftime(post_datetime_format),
-        estimated_restoration_time.strftime(post_datetime_format),
+        est_restoration_post_text,
         event["cause"],
         hashtag_string,
     )
@@ -298,10 +315,12 @@ class SclOutage(Base):
     start_time: Mapped[datetime] = mapped_column()
     num_people: Mapped[int] = mapped_column()
     max_num_people: Mapped[int] = mapped_column()
+    neighborhood: Mapped[Optional[str]] = mapped_column()
+    city: Mapped[Optional[str]] = mapped_column()
     outage_geometries: Mapped[Geometry] = mapped_column()

     def __repr__(self) -> str:
-        return f"SclOutage(scl_outage_id={self.scl_outage_id!r}, most_recent_post_id={self.most_recent_post_id!r}, initial_post_id={self.initial_post_id!r}, map_media_post_id={self.map_media_post_id!r}, last_updated_time={self.last_updated_time!r}, no_longer_in_response_time={self.no_longer_in_response_time!r}), start_time={self.start_time!r}), num_people={self.num_people!r}), max_num_people={self.max_num_people!r}), outage_geometries={self.outage_geometries!r}"
+        return f"SclOutage(scl_outage_id={self.scl_outage_id!r}, most_recent_post_id={self.most_recent_post_id!r}, initial_post_id={self.initial_post_id!r}, map_media_post_id={self.map_media_post_id!r}, last_updated_time={self.last_updated_time!r}, no_longer_in_response_time={self.no_longer_in_response_time!r}, start_time={self.start_time!r}, num_people={self.num_people!r}, max_num_people={self.max_num_people!r}, neighborhood={self.neighborhood!r}, city={self.city!r}, outage_geometries={self.outage_geometries!r})"


 engine = create_engine("sqlite:///scl.db")
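The new `neighborhood` and `city` columns use `Mapped[Optional[str]]`, which SQLAlchemy 2.0 maps to nullable text columns. The `Geometry` column type referenced by `outage_geometries` is defined elsewhere in scl.py and is not part of this diff; given the `import sqlalchemy.types as types` and `import shapely` lines above, one plausible shape for it is a TypeDecorator that round-trips the MultiPolygon through WKT. This is an illustrative sketch only, not the repository's actual implementation, and how it is wired into `Mapped[Geometry]` (e.g. via a `type_annotation_map`) is likewise outside this diff:

```python
import shapely
import sqlalchemy.types as types


class Geometry(types.TypeDecorator):
    """Store shapely geometries as WKT text in SQLite (illustrative sketch only)."""

    impl = types.Text
    cache_ok = True

    def process_bind_param(self, value, dialect):
        # shapely.to_wkt serializes any shapely geometry, including MultiPolygon.
        return None if value is None else shapely.to_wkt(value)

    def process_result_value(self, value, dialect):
        return None if value is None else shapely.from_wkt(value)
```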
@@ -327,15 +346,27 @@ with Session(engine) as session:
         status = None

         outage_geometries = convert_outage_geometry(event)
-        scl_outage_location = Point(event["latitude"], event["longitude"])

         try:
             hashtag_string = get_hashtag_string(event)
             existing_record = lookup_result.one()
             updated_properties = []
             updated_entries = []
-            if estimated_restoration_time != existing_record.estimated_restoration_time:
+            est_restoration_diff_mins = (
+                abs(
+                    (
+                        estimated_restoration_time
+                        - existing_record.estimated_restoration_time
+                    ).total_seconds()
+                )
+                / 60
+            )
+            # Only post if estimated restoration time has changed by 60m or more
+            if est_restoration_diff_mins >= 60:
                 existing_record.estimated_restoration_time = estimated_restoration_time
+                if estimated_restoration_time > datetime.now():
+                    # New estimated restoration time is in the future, so likely to be a real time
                     updated_properties.append("estimated restoration")
                     updated_entries.append(
                         "Est. Restoration: {}".format(
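The update path now only replies when the estimated restoration time moves by at least an hour in either direction, and only treats future timestamps as real. The arithmetic, spelled out with invented timestamps:

```python
from datetime import datetime

previous = datetime(2024, 11, 20, 18, 0)  # stored estimated_restoration_time
latest = datetime(2024, 11, 20, 19, 30)   # value from the current feed poll

est_restoration_diff_mins = abs((latest - previous).total_seconds()) / 60
print(est_restoration_diff_mins)  # 90.0 -> >= 60, so an update post is made

# A 15-minute nudge (diff of 15.0) would be ignored under the new check.
```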
@@ -385,6 +416,7 @@ with Session(engine) as session:
                        ),
                    )
            if max_event_class["is_postable"] and existing_record.initial_post_id:
+                try:
                    post_result = mastodon_client.status_post(
                        status="\n".join(updated_entries),
                        in_reply_to_id=existing_record.most_recent_post_id,
@@ -392,13 +424,33 @@ with Session(engine) as session:
                        language="en",
                    )
                    existing_record.most_recent_post_id = post_result["id"]
+                except mastodon.MastodonNotFoundError:
+                    print(
+                        "Could not post a reply to the existing post, skip this update"
+                    )
            elif max_event_class["is_postable"]:
                print(
                    "Posting an event that grew above the threshold required to post"
                )
                initial_post_result = do_initial_post(
-                    event, event_class, start_time, estimated_restoration_time
+                    event,
+                    event_class,
+                    start_time,
+                    estimated_restoration_time,
+                    outage_geometries,
                )
+                try:
+                    existing_record.neighborhood = initial_post_result[
+                        "neighborhood"
+                    ]
+                except KeyError:
+                    pass
+
+                try:
+                    existing_record.city = initial_post_result["city"]
+                except KeyError:
+                    pass
+
                existing_record.initial_post_id = initial_post_result["post_id"]
                existing_record.most_recent_post_id = initial_post_result["post_id"]
                existing_record.map_media_post_id = initial_post_result[
@@ -412,6 +464,8 @@ with Session(engine) as session:
            print("Existing record not found")
            post_id = None
            map_media_post_id = None
+            neighborhood = None
+            city = None
            if not event_class["is_postable"]:
                print(
                    "Outage is {} considered postable, will not post".format(
@@ -420,11 +474,25 @@ with Session(engine) as session:
                )
            else:
                initial_post_result = do_initial_post(
-                    event, event_class, start_time, estimated_restoration_time
+                    event,
+                    event_class,
+                    start_time,
+                    estimated_restoration_time,
+                    outage_geometries,
                )
                post_id = initial_post_result["post_id"]
                map_media_post_id = initial_post_result["map_media_post_id"]

+                try:
+                    neighborhood = initial_post_result["neighborhood"]
+                except KeyError:
+                    pass
+
+                try:
+                    city = initial_post_result["city"]
+                except KeyError:
+                    pass
+
                new_outage_record = SclOutage(
                    scl_outage_id=event["id"],
                    outage_user_id=event["identifier"],
@@ -438,6 +506,8 @@ with Session(engine) as session:
                    start_time=start_time,
                    num_people=event["numPeople"],
                    max_num_people=event["numPeople"],
+                    neighborhood=neighborhood,
+                    city=city,
                    outage_geometries=outage_geometries,
                )
                session.add(new_outage_record)
@@ -455,9 +525,7 @@ with Session(engine) as session:
        if active_outage.most_recent_post_id:
            try:
                post_result = mastodon_client.status_post(
-                    status="This outage is reported to be resolved.\n\n#SeattleCityLightOutage #SCLOutage #SCLOutage{}".format(
-                        active_outage.outage_user_id
-                    ),
+                    status="This outage is no longer in the SCL feed, which usually means it's either been resolved, or split into multiple smaller outages.\n\n#SeattleCityLightOutage #SCLOutage",
                    in_reply_to_id=active_outage.most_recent_post_id,
                    visibility="public",
                    language="en",