Forked from James Kip's Texas Vaccine Updates (thank you James!!)
If you would like to build/contribute a crawler, check out the Crawler How To below.
```
npm install
npm start
npm test
```
Here is a `.env` file that will get you up and running. Just save it as `.env` in this project:
```
NODE_ENV=development
PORT=1919
USER_AGENT="Pennsylvania Vaccine watching bot [your email address here]"
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/BLAHBLAHBLAHBLAHBLAHBLAHBLAHBLAHBLAHBLAHBLAH
```
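For reference, here is a minimal sketch of how these values might be read on the Node side, assuming the project loads them with the `dotenv` package (check `index.js` for how it's actually done):

```js
// Hypothetical sketch: read the .env values above at startup.
require("dotenv").config(); // assumption: dotenv (or similar) loads .env

const port = process.env.PORT || 1919;
const userAgent = process.env.USER_AGENT;
const slackWebhookUrl = process.env.SLACK_WEBHOOK_URL;

if (!slackWebhookUrl) {
  console.warn("SLACK_WEBHOOK_URL is missing -- Slack notifications will not be sent.");
}
```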
You'll need to create an app in Slack to get working webhook URLs.
- Create or sign in to a Slack workspace where you have permissions to manage apps (TODO: what permissions exactly?)
- Click the workspace name in the upper left to open the workspace menu, then go to "Administration" -> "Manage apps"
- Click on "Build" in the upper right. This goes to https://api.slack.com/apps
- Create a new app, pick a name, and pick the workspace you want to add it to
- Under Settings -> Basic Information, toggle open "Add features and functionality", and click "Incoming webhooks".
- Activate Incoming Webhooks by turning the switch to "on".
- Scroll down and under "Webhook URLs for your Workspace", click the "Add New Webhook to Workspace" button.
- Pick a channel for the bot to post to
- Copy the webhook URL. Now you have a value for your `.env` file.
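Before wiring the webhook into the crawlers, it can be handy to post a test message to it. Here's a minimal sketch, assuming Node 18+ (for the built-in `fetch`) and that `SLACK_WEBHOOK_URL` is set in your `.env`:

```js
// Post a test message to the Slack incoming webhook.
// Incoming webhooks accept a JSON payload with a "text" field.
require("dotenv").config(); // assumption: dotenv is used to load .env

async function testWebhook() {
  const res = await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text: "Hello from the vaccine watcher bot!" }),
  });
  console.log("Slack responded with status", res.status); // 200 means the hook works
}

testWebhook().catch(console.error);
```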
Visit the Pennsylvania Vaccine Provider Information map, then pan and zoom the view to the area you're interested in.
Open the Attribute Table by clicking the black bar at the bottom of the map. Click Options and select Export all to CSV.
Go through each link in the spreadsheet and take a look to see if the facility's website offers online scheduling. This can take some work, but generally I just search for "vaccine" on the page and follow the links.
Try to find the page that the facility will update once appointments are available. Our goal is to visit that page many times until we detect it changes, which signals that appointments may be available.
Make a copy of the `example.js` crawler. Name it something like `pharmacy-name.js` and save it in the `crawlers` folder.
Open up the dev tools (I use Chrome) and navigate to the `Network` tab. Refresh the website to capture the network requests. We need to find the request that corresponds with the text on screen that will change once appointments are available. In the example below, Birdsboro Pharmacy prints `No Availability` when there are no appointments, so we're going to find the request that returns that text. Try selecting the top request first, then go to the `Response` tab in the section on the right side of the dev tools. Hit `ctrl`/`cmd` + `F` to search the response for the text (`No Availability`). If you don't find it, keep searching the other requests' responses until you find it.
Once you've found the request, it's time to copy the request code so that we can insert it into our crawler. Right click on the request and select Copy -> Copy as Node.js fetch. Paste that Node.js fetch into your text editor, and use the fetch URL as the `dataUrl`. You can use the website's main URL as the `scheduleURL`, and change the response handler's conditional to test whether the `No Availability` text exists in the response. Don't forget to change the `checkExample` function definition and export to match the name of the facility (e.g. `checkBirdsboro`), and finally update `index.js` to include your new crawler by adding `await checkBirdsboro()` to the main function.
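To tie these steps together, here's a rough sketch of what a finished crawler could look like. This is only an illustration, not the actual `example.js` from this repo: both URLs are placeholders, the `node-fetch` import and the Slack call are assumptions, and you should mirror whatever pattern `example.js` actually uses.

```js
// crawlers/birdsboro-pharmacy.js -- hypothetical sketch; follow example.js in this repo.
const fetch = require("node-fetch"); // assumption: the project uses node-fetch

// dataUrl: the request you found in the Network tab that returns the availability text.
// scheduleURL: the page a human would visit to actually book an appointment.
const dataUrl = "https://example.com/birdsboro/appointments"; // placeholder
const scheduleURL = "https://example.com/birdsboro"; // placeholder

async function checkBirdsboro() {
  const response = await fetch(dataUrl, {
    headers: { "User-Agent": process.env.USER_AGENT },
  });
  const body = await response.text();

  // If the "No Availability" marker disappears, something changed and
  // appointments may be open, so send the scheduling link to Slack.
  if (!body.includes("No Availability")) {
    await fetch(process.env.SLACK_WEBHOOK_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        text: `Birdsboro Pharmacy may have appointments: ${scheduleURL}`,
      }),
    });
  }
}

module.exports = { checkBirdsboro };
```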