Play with data for weather waifus.

Learning Julia

I've admired Julia from afar for too long. I want to learn how to use this language, so I started this little project to feel the language out and do something fun.

First Time

To download and compile all the dependencies, use the REPL's package mode.

  • ] (enter package mode)
  • activate .
  • instantiate
  • Hit backspace to go back to the normal Julia mode in the REPL.
cd WeatherNews.jl
julia

julia> ]
(@v1.9) pkg> activate .
  Activating project at `~/WeatherNews.jl`

(WeatherNews) pkg> instantiate

That instantiate may take some time, especially on a slower computer. Once it finishes, you can try playing with the code.

using WeatherNews
using WeatherNews: API, DB
v = API.video_ids()
s = WeatherNews.get_schedule()

using DataFrames
s |> DataFrame
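The schedule comes back as a vector of table-like rows, which is why the `|> DataFrame` pipe above works. Here is a minimal, self-contained sketch of that pattern; the column names and values are made up for illustration, not the real API fields:

```julia
using DataFrames

# Hypothetical rows standing in for WeatherNews.get_schedule() output.
# A vector of NamedTuples converts straight into a DataFrame.
rows = [
    (caster = "Example A", hour = 5),
    (caster = "Example B", hour = 11),
]

df = rows |> DataFrame

# Once it's a DataFrame, the usual DataFrames.jl tools apply,
# e.g. keeping only the rows scheduled before 10:00:
morning = filter(:hour => h -> h < 10, df)
```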



This snippet was the first script, the one that gave birth to this project.

Running from the CLI

julia --project -O0 --compile=min bin/weathernews.jl

Running from the REPL

julia> ]
(@v1.9) pkg> activate .
# Hit backspace to return to the Julia prompt, then:
julia> using WeatherNews

Data Collection System

There are a variety of scripts written in Julia, Bash, and Perl for collecting WeatherNews data. An example of how they should be run can be found in the crontab file.

  • bin/wndb-insert.jl :: This script is meant to be run from cron multiple times per day to insert new rows into the schedule table.
  • bin/wndb-video.jl :: This script is meant to be run hourly to keep the video_id of the current schedule item up-to-date.
  • bin/wndb-fix-conflict.jl :: This script is meant to be run throughout the day to detect and fix schedule changes. If the API response for the schedule conflicts with what's in the database, the API response wins and what was in the database gets moved to the cancellation table.
  • bin/ :: This script is for archiving raw JSON responses from the WeatherNews API. I'm archiving the JSON mostly for redundancy. If wndb-insert.jl were to fail, I'd hopefully have JSON data I could use to repair the database once the problem was resolved.
  • bin/ :: Archive video JSON responses.
  • bin/ :: Archive mscale JSON responses.
  • bin/ :: This script is meant to be run hourly to record the M-scale value from the WeatherNews API.
  • bin/ :: This script scrapes an invidious page for au PAY Market livestreams and inserts new au PAY shows it hasn't seen before. It can be run infrequently like once a week.
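The crontab file in this repo is the authoritative schedule. The entries below are only an illustrative sketch of how a couple of the scripts above might be wired up; the times and the checkout path are assumptions, not what the project actually uses:

```
# Hypothetical cron entries -- times and paths are assumptions.
# Insert new schedule rows a few times per day:
15 0,6,12,18 * * *  cd "$HOME/WeatherNews.jl" && julia --project bin/wndb-insert.jl
# Keep the current item's video_id fresh every hour:
5 * * * *           cd "$HOME/WeatherNews.jl" && julia --project bin/wndb-video.jl
```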

Web Site

Under www/ is the source for a web site, written in Perl, that displays the WeatherNews data collected by the various scripts.