Commit graph

20 commits

SHA1 Message Date
0dcbe6fc2d Add users and closet_hangers to local db schema 2020-10-22 19:53:32 -07:00
99e6480486 add logging to modeling
my hope is that, if we fuck things up, this will make it clear 😅
2020-10-06 07:06:19 -07:00
df2d814c13 enable running against a local dev database
had to add some missing tables, but it seems to work! (There are some known errors, though, from assumptions we make, e.g. that Blue Acaras exist.)
2020-10-06 06:18:19 -07:00
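
For context, a rough sketch of what running against a local dev database could look like, assuming a mysql2-style client (the env var names, defaults, and database name are assumptions, not necessarily what the app does):

    const mysql = require("mysql2/promise");

    // Point the app at a local MySQL instance via env vars, falling back to
    // local dev defaults. (All of these names are hypothetical.)
    async function connectToDb() {
      return await mysql.createConnection({
        host: process.env.IMPRESS_MYSQL_HOST || "localhost",
        user: process.env.IMPRESS_MYSQL_USER || "impress_dev",
        password: process.env.IMPRESS_MYSQL_PASSWORD || "",
        database: process.env.IMPRESS_MYSQL_DATABASE || "openneo_impress_dev",
      });
    }
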
41e70ba8d0 finish modeling full pet appearance 2020-09-25 03:29:02 -07:00
50537758c5 start test/dev db IDs at 1, not wherever prod is
We download the schema from prod and omit real data, but I didn't notice that we were still pulling the AUTO_INCREMENT counter metadata for IDs! Now, we scrub that from the schema file we save.
2020-09-25 03:29:02 -07:00
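
A minimal sketch of that scrub step, assuming the schema is saved as a plain mysqldump file (the path and regex here are assumptions, not the script's actual code):

    const fs = require("fs");

    const schemaPath = "scripts/db/schema.sql"; // hypothetical path
    const schema = fs.readFileSync(schemaPath, "utf8");

    // mysqldump embeds prod's counter as e.g. `AUTO_INCREMENT=123456` in each
    // CREATE TABLE; removing it makes new rows start at ID 1 in dev/test.
    const scrubbed = schema.replace(/ AUTO_INCREMENT=\d+/g, "");

    fs.writeFileSync(schemaPath, scrubbed);
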
5332c9e265 save biology assets on model
and start on pet states, in comments for now :)
2020-09-25 03:29:01 -07:00
ff3fc943d7 modeling saves pet type 2020-09-25 03:29:01 -07:00
9111dfddd3 save item swf assets during modeling 2020-09-25 03:29:01 -07:00
dfeeb9fe0d modeling will save new item data (but not assets)
just a first step!
2020-09-18 07:34:41 -07:00
07691a4e6b add basic test db infra
Boom, now we can also run a clean MySQL test db on each test that wants it :)

The test I wrote as a sample is currently marked `it.skip` because it's not passing yet!
2020-09-18 05:50:17 -07:00
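
A rough sketch of that per-test setup, assuming Jest and a mysql2-style client (the credentials, schema path, and database name are assumptions):

    const fs = require("fs");
    const mysql = require("mysql2/promise");

    let db;

    beforeEach(async () => {
      db = await mysql.createConnection({
        host: "localhost",
        user: "impress_test", // hypothetical test credentials
        multipleStatements: true,
      });
      // Recreate the test database from the saved schema, so each test that
      // wants a db starts from a clean slate.
      await db.query("DROP DATABASE IF EXISTS openneo_impress_test");
      await db.query("CREATE DATABASE openneo_impress_test");
      await db.query("USE openneo_impress_test");
      await db.query(fs.readFileSync("scripts/db/schema.sql", "utf8"));
    });

    afterEach(async () => {
      await db.end();
    });
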
68b7beae1b start setting up a local dev database 2020-09-18 05:24:03 -07:00
12b87ee7d1 set up auth on the server + test utils 2020-09-02 23:00:16 -07:00
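
A hedged sketch of what server-side auth like this can look like, assuming Auth0-issued JWTs verified with the `jsonwebtoken` and `jwks-rsa` packages (the claim usage and setup details are assumptions, not necessarily how this server does it):

    const jwt = require("jsonwebtoken");
    const jwksClient = require("jwks-rsa");

    const client = jwksClient({
      jwksUri: `https://${process.env.AUTH0_DOMAIN}/.well-known/jwks.json`,
    });

    // Look up the signing key that matches the token's key ID.
    function getKey(header, callback) {
      client.getSigningKey(header.kid, (err, key) => {
        callback(err, key && key.getPublicKey());
      });
    }

    // Resolve to the user ID (the `sub` claim) if the token verifies.
    function getUserIdFromToken(token) {
      return new Promise((resolve, reject) => {
        jwt.verify(token, getKey, { algorithms: ["RS256"] }, (err, decoded) => {
          if (err) return reject(err);
          resolve(decoded.sub);
        });
      });
    }
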
4796d213aa oh right, remove LIMIT 1 from export script! 2020-09-02 15:50:33 -07:00
56332ec8c0 redo auth0 export script to use APIs, not JSON 2020-09-02 15:26:33 -07:00
6982f00729 script to export users to auth0 2020-09-02 03:49:58 -07:00
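
One plausible shape for the API-based version of that export, assuming the Auth0 Management API's POST /api/v2/users endpoint (the connection name, token source, and field mapping are all assumptions):

    const fetch = require("node-fetch"); // or global fetch on newer Node

    async function exportUserToAuth0(user, managementToken) {
      const response = await fetch(
        `https://${process.env.AUTH0_DOMAIN}/api/v2/users`,
        {
          method: "POST",
          headers: {
            Authorization: `Bearer ${managementToken}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            connection: "Username-Password-Authentication", // hypothetical
            email: user.email,
            username: user.name,
            password: generateTemporaryPassword(), // hypothetical helper
          }),
        }
      );
      if (!response.ok) {
        throw new Error(
          `Auth0 rejected user ${user.name}: HTTP ${response.status}`
        );
      }
      return await response.json();
    }
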
3a6e3fac8e add isCommonlyUsedByItems to Zone
This is in preparation for hiding bio zone restrictions but showing item zone restrictions!

I also refactor the build-cached-data script substantially, to run GraphQL against the server instead of a custom query.
2020-09-01 01:16:30 -07:00
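
A hedged sketch of what adding a field like this could look like, assuming an apollo-server-style schema built from gql tagged templates (the resolver logic here, checking a hardcoded set of zone IDs, is an assumption, not necessarily how the real field is computed):

    const { gql } = require("apollo-server");

    const typeDefs = gql`
      extend type Zone {
        "Whether this zone is one that items commonly occupy or restrict."
        isCommonlyUsedByItems: Boolean!
      }
    `;

    // Hypothetical: a hand-maintained set of zone IDs that items commonly use.
    const COMMON_ITEM_ZONE_IDS = new Set(["3", "26", "40"]);

    const resolvers = {
      Zone: {
        isCommonlyUsedByItems: ({ id }) => COMMON_ITEM_ZONE_IDS.has(id),
      },
    };
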
47d22ad25c Build cached zones, stop querying on server
In this change, we cache the zones table as part of the JS build process. This keeps the database as our source of truth, while aggressively caching the data at deploy time.

See the new README for some rationale!

I tested this by pulling up dev Honeycomb, and observing that we no longer run db queries to `zones` in the new traces for the wardrobe page. (It's a good thing we did it this way, because I noticed some code in the server that was still loading the zone anyway, and fixed it here!)
2020-08-19 17:50:05 -07:00
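
A rough sketch of that build step, assuming a Node script that queries the GraphQL server and writes the result out as JSON for the app to import (the endpoint, query shape, and file path are assumptions):

    const fs = require("fs");
    const fetch = require("node-fetch"); // or global fetch on newer Node

    async function buildCachedZones() {
      const res = await fetch("http://localhost:3000/api/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          query: `{ allZones { id label depth isCommonlyUsedByItems } }`,
        }),
      });
      const { data } = await res.json();

      // At runtime, the app imports this JSON instead of querying `zones`.
      fs.writeFileSync(
        "src/app/cached-data/zones.json",
        JSON.stringify(data.allZones, null, 2)
      );
    }
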
20523a9562 add dump mode to cache-asset-manifests
I was finding the script too slow on my local machine: the SQL round-trip times were long, and with one connection, they were essentially a serial bottleneck, not taking much advantage of our concurrency.

Here, I instead add a `--dump` option, which outputs SQL to stdout. I then uploaded the resulting SQL to the DTI box and ran it up there: the network part happens fast on my machine, and the SQL part happens fast on the cloud machine!

I first considered uploading this script to the cloud machine, but it runs an old Ubuntu release and I couldn't figure out how to install a recent Node.js on it 🙃
2020-08-19 17:19:15 -07:00
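
A sketch of the dump-or-execute switch, assuming a Node script (the flag parsing, table, and column names are assumptions, not the script's actual code):

    const mysql = require("mysql2"); // used here just for safe SQL formatting
    const shouldDump = process.argv.includes("--dump");

    async function saveManifest(db, layerId, manifestJson) {
      const sql = mysql.format(
        "UPDATE swf_assets SET manifest = ? WHERE id = ?;",
        [manifestJson, layerId]
      );
      if (shouldDump) {
        // Print the SQL so it can be uploaded and run right next to the
        // database, instead of paying a round-trip per row from my machine.
        console.log(sql);
      } else {
        await db.query(sql);
      }
    }
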
4977f1ee54 Revert "cache zone data"
This reverts commit 0f7ab9d10e.

The Production Vercel deploys don't seem to like how I did this build trick, even though the Preview deploys seem fine with it 🤔 Reverting for now; I've sent a message to Vercel support.
2020-08-17 18:49:37 -07:00
e20364cc0a create cache-asset-manifests script
Just gonna bulk load all those manifests into the db, and then that should make most loads notably faster by removing the network request! 🤞

We'll still load manifests inline sometimes, but only the first time anyone pulls up the layer in impress-2020. After that, it should be cached forever!
2020-08-17 18:23:39 -07:00
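
A rough sketch of that bulk-caching loop, assuming manifests live at a URL derived from each layer's asset URL (the URL scheme, table, and column names are assumptions, and the real script presumably fetches with more concurrency than this serial loop):

    const fetch = require("node-fetch"); // or global fetch on newer Node

    async function cacheAllManifests(db) {
      const [layers] = await db.query(
        "SELECT id, url FROM swf_assets WHERE manifest IS NULL"
      );
      for (const layer of layers) {
        // Hypothetical scheme: derive the manifest URL from the asset URL.
        const manifestUrl = layer.url.replace(/\.swf$/, "/manifest.json");
        const res = await fetch(manifestUrl);
        if (!res.ok) continue; // some layers just don't have manifests yet
        const manifest = await res.text();
        await db.query("UPDATE swf_assets SET manifest = ? WHERE id = ?", [
          manifest,
          layer.id,
        ]);
      }
    }
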