Commit graph

1487 commits

76476b1a45 Update my desktop SSH key
I did a pretty thorough reset of my desktop machine, and rather than go
spelunking for the same private key, I just rolled it over to a new
one. Let's set it up!
2024-01-14 03:09:20 -08:00
1a0f544f7b Move analytics to analytics.openneo.net
Self-hosted Plausible instance! I have need of usage numbers again,
after a good few years of just not using it; but I don't want to send
the data to Google, and I enjoy self-hosting things, so here we have it!

I actually forgot we were still sending the DTI 2020 numbers to
Plausible's cloud instance, oops! I stopped paying for it 😅 They do a
pretty good job anonymizing, so I'm not too worried about the practical
effects of that, but I'll be pleased to keep it more in-house going
forward.

I'm not sure why `async` isn't in the recommended tag anymore, but fine
by me!
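For reference, the self-hosted embed is just the standard Plausible snippet pointed at our own instance, roughly like this in a Next.js document — a sketch, not the exact code in this commit, and the `data-domain` value is illustrative:

```tsx
// pages/_document.tsx (sketch) — `data-domain` here is illustrative.
import { Html, Head, Main, NextScript } from "next/document";

export default function Document() {
  return (
    <Html>
      <Head>
        {/* Self-hosted Plausible: the standard embed, pointed at our own
            instance. The currently recommended tag uses `defer`, not `async`. */}
        <script
          defer
          data-domain="impress-2020.openneo.net"
          src="https://analytics.openneo.net/js/script.js"
        ></script>
      </Head>
      <body>
        <Main />
        <NextScript />
      </body>
    </Html>
  );
}
```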
2023-12-04 22:46:14 -08:00
4cd5cf5800 Fix handling of assets with https://images.neopets.com/temp SWF URLs
Weird! Well! This caused two issues.

One is that we used to filter down to only assets whose URLs end in
`.swf`, because I never added support for the sound ones :p

The other is that we were using the SWF URL to infer the potential
manifest URLs, which we can't do when the SWF URL is a weird
placeholder!

Thankfully, just yesterday (wow!) we happened to have Classic DTI keep
track of `manifest_url` during the modeling process, and we backfilled
everything. So the most recent `temp` assets have their manifest! But
slightly older ones (not that old tho!!) didn't, so I manually inferred
the manifest URL from the asset's `remote_id` field instead (which
worked cuz there was no hash component to the manifest URL, unlike some
manifests).

The item "Flowing Black Cloak" (86668) on the Zafara is the one we were
looking at and testing with!
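If you wanted to express that fallback order in code, it'd look roughly like this — a sketch only, with hypothetical helper names, not the actual change in this commit:

```ts
// Sketch of the fallback order; the helper names are hypothetical.
declare function inferManifestUrlFromSwfUrl(swfUrl: string): string | null;
declare function inferManifestUrlFromRemoteId(remoteId: string): string | null;

function manifestUrlForAsset(asset: {
  manifestUrl: string | null;
  swfUrl: string;
  remoteId: string;
}): string | null {
  // Best case: modeling already recorded the manifest URL for us.
  if (asset.manifestUrl != null) {
    return asset.manifestUrl;
  }

  // The weird placeholder SWF URL tells us nothing, so fall back to inferring
  // from the asset's remote ID instead.
  if (asset.swfUrl.startsWith("https://images.neopets.com/temp")) {
    return inferManifestUrlFromRemoteId(asset.remoteId);
  }

  // Otherwise, infer the candidate manifest URL from the SWF URL as usual.
  return inferManifestUrlFromSwfUrl(asset.swfUrl);
}
```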
2023-11-11 13:28:47 -08:00
c3a9c24d22 Fix pet loading
Needed to add some headers to satisfy StackPath etc!

(Also, latest Prettier ran on this file for the first time, so a few
changes there too!)
2023-11-06 13:24:08 -08:00
74f6d112ee Disable Sentry
I want to restrict our monitoring just to the wardrobe-2020 fork on the
main app now!
2023-11-06 12:30:53 -08:00
0b9cec9c44 Fix some Typescript errors when using latest Next
Ran into these when building for deploy!
2023-11-02 16:50:05 -07:00
3b6bc0b6eb Oops, add Vary: Origin to GraphQL requests
Lol I guess we've probably just been having intermittent CORS issues
forever, oops. I hope the cache has been mostly warmed on the right
thing! But today I checked and the entire species/color picker on the
Rails app wasn't working for CORS reasons, so like. Yeah oof.
2023-11-02 16:35:11 -07:00
23f1f53b64 Upgrade to Next 12.3.2
I didn't really want to do this exactly, but I think installing a global
Typescript binary caused some problems here, where running the app had
started giving me an error like "you need to install `@types/react`!"
and I'm like. I did? years ago?

I searched for the error and people said "oh just upgrade Next" so I did
and now it's fixed I guess? I only bumped up to the latest version of
v12, rather than up to v14 where things are now, to avoid breakages. I
poked around the site ever so slightly and it seems fine, so I'm not
worried about it!
2023-11-02 16:33:33 -07:00
a2594d1797 Fix crash on user lists page from orphaned hangers
If you delete a closet list without deleting the items in it, it'll dump them into your default own/want lists. This isn't necessarily the behavior we really want, but oh well, that's the truth right now!

The GraphQL endpoint was assuming that all hanger `list_id` values were valid, oops! Now, we ignore list IDs that don't match a list.
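The guard is roughly this shape — a sketch with made-up field names, not the literal resolver code:

```ts
// Sketch of the fix: only group hangers into lists that actually exist.
type ClosetHanger = { id: string; itemId: string; listId: string | null };
type ClosetList = { id: string; name: string };

function groupHangersByList(
  closetHangers: ClosetHanger[],
  closetLists: ClosetList[],
): Map<string, ClosetHanger[]> {
  const validListIds = new Set(closetLists.map((list) => list.id));
  const byList = new Map<string, ClosetHanger[]>();

  for (const hanger of closetHangers) {
    // Before: we assumed every non-null listId pointed at a real list, and
    // crashed when it didn't. Now: ignore list IDs that don't match a list.
    if (hanger.listId != null && !validListIds.has(hanger.listId)) {
      continue;
    }
    const key = hanger.listId ?? "default";
    const group = byList.get(key) ?? [];
    group.push(hanger);
    byList.set(key, group);
  }

  return byList;
}
```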
2023-10-24 14:48:55 -07:00
53deee85dd Use MajorErrorMessage on user lists page
There's an error here that I'm trying to fix, first I'll make the error message nicer for people!
2023-10-24 14:43:43 -07:00
e70e67211e Restart the app every 8 minutes
Idk why it's memory leaking so hard lately? But let's uhhh just reboot it a ton for now, while we continue to work on migrating workloads offa there.
2023-10-17 01:00:23 -07:00
76dd8059bb Extract IMPRESS_MYSQL_HOST environment variable
Oh beans, I made an env variable change that I thought would switch us over to db.impress.openneo.net, and I was just plain wrong, darn. impress-2020 has been fully failing since then, oops!!!

Here, we change it to be an environment variable, so that in the future it will work how I want it to lol
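Something like this, roughly — the driver, the other env variable names, and the fallback value here are all illustrative, not necessarily what the app actually uses:

```ts
// Sketch: read the DB host from the environment instead of hardcoding it.
// (mysql2, the other env var names, and the fallback host are assumptions.)
import mysql from "mysql2/promise";

const pool = mysql.createPool({
  host: process.env.IMPRESS_MYSQL_HOST || "impress.openneo.net",
  user: process.env.IMPRESS_MYSQL_USER,
  password: process.env.IMPRESS_MYSQL_PASSWORD,
  database: "openneo_impress",
});
```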
2023-10-12 20:45:06 -07:00
a5dcf369bb Point the update link to the Pardon Our Dust page 2023-10-12 18:13:11 -07:00
a14bc9bebd Fix Vary header for CORS
Oops, we added behavior that varies the CORS response headers according to the incoming `Origin` header, but we forgot to add `Vary: Origin`!

This doesn't cause an issue for the app when you make requests to the server directly, but since it's behind a Fastly cache layer, we ended up caching responses that didn't include CORS headers but should have.

Now, this will instruct the Fastly cache to treat requests with different `Origin` headers as being entirely different. (This means we won't be sharing caches between requests from impress-2020 and the Rails app anymore, but that should be okay in practice!)
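The pattern, as a sketch — assuming a Next.js API-route-style handler, with an illustrative allowlist:

```ts
// Sketch of the CORS pattern: echo an allowed Origin back, and tell caches
// (like Fastly) that the response varies by Origin. Allowlist is illustrative.
import type { NextApiRequest, NextApiResponse } from "next";

const ALLOWED_ORIGINS = new Set([
  "https://impress.openneo.net",
  "https://impress-2020.openneo.net",
]);

export function applyCorsHeaders(req: NextApiRequest, res: NextApiResponse) {
  const origin = req.headers.origin;
  if (origin && ALLOWED_ORIGINS.has(origin)) {
    res.setHeader("Access-Control-Allow-Origin", origin);
  }
  // Without this, a shared cache can serve a response negotiated for one
  // Origin to a request from a different Origin, with mismatched CORS headers.
  res.setHeader("Vary", "Origin");
}
```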
2023-10-12 14:58:26 -07:00
e31a79e793 Merge branch 'main' of github.com:openneo/impress-2020 2023-08-29 15:06:43 -07:00
756ee8dcb2 Add workaround for pet lookups with leading digits
This was Dice's idea ty!! Now, instead of crashing when looking up any pet whose name starts with a number, we do a clever lil workaround instead! We don't get *all* the data back (we're missing metadata), but that's fine for the main use case of typing your pet name in at the homepage.
2023-08-29 15:05:19 -07:00
80428e9834 Fix pet loading for most pets
Sigh, I guess xmlrpc was deleted? Back to the method that doesn't work for pets with leading digits in their names, sobbe

Dice has a neat idea for how to work around that, but I'm not sure how to fit it in our architecture, let's take a look!
2023-08-29 14:40:59 -07:00
85d68f68e1 Enable Classic DTI to access impress-2020 via CORS
owo, uwu,
2023-08-20 15:40:21 -07:00
17be011ff8 Upgrade to yarn 2
This just kinda happened on my macOS machine without me really understanding, but it seemed to work? So I won't question it!
2023-07-21 18:30:07 -07:00
571949b066 Increase timeout in archive SQL queries
One problem I run into with the archive task is that sometimes the queries time out? My hunch is that maybe some of the assets have like, weirdly big manifest files that are being transferred as surprisingly big text files?

Anyway, I'm increasing the timeout to 20s, which is big, but big feels good for a script that doesn't run often and where failing is not great news!

I'm also idly considering whether I wanna finally put in the work to do a bulk S3 uploader sometime. The current version that iterates over multiple `aws s3 cp` calls is just real slow, I think because it establishes a new connection each time, and that operation is maybe surprisingly expensive? And the CLI doesn't have a way to do multiple uploads any more precisely than "sync up this whole folder", which is slow when the folder contains a lot of stuff you _know_ doesn't need to head up there.
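For the record, the bulk-uploader idea would look something like this with the AWS SDK — a sketch under assumed names (bucket, key scheme), not something that exists yet:

```ts
// Sketch of the "one client, many uploads" idea, using AWS SDK v3.
// The bucket name and key scheme are illustrative assumptions.
import { readFile } from "node:fs/promises";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

async function uploadAll(localPaths: string[]) {
  // Reuse the same client (and its connections) for every upload, instead of
  // paying `aws s3 cp`'s per-invocation startup cost each time.
  for (const localPath of localPaths) {
    const body = await readFile(localPath);
    await s3.send(
      new PutObjectCommand({
        Bucket: "dti-archive", // illustrative
        Key: localPath.replace(/^.*\/archive\//, ""), // illustrative key scheme
        Body: body,
      }),
    );
  }
}
```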
2023-06-12 10:56:40 -07:00
f54ea61854 Merge branch 'state-of-dti-2022' into main 2023-03-23 21:54:53 -07:00
c28475fcdc Add a big callout link about the State of DTI message 2023-02-04 23:59:18 -08:00
e5e368fb65 More honest language in the homepage "beta" module 2023-02-04 23:49:25 -08:00
4287f12912 Update "State of DTI" message copy & URL 2023-02-04 23:48:55 -08:00
f2832a3bde Logic & comment updates to #30
Tested this out and compared to Dice's other work in Neobot and I think the condition should be the other way around, as it is here? (I found myself starting to write the explanatory comment, and realizing it wasn't making sense, then going heyyy wait a minute lol)
2022-12-22 18:58:49 -08:00
60d2f9a014 Merge branch 'main' of github.com:matchu/impress-2020 into main 2022-12-22 18:49:29 -08:00
82ecf9bbd3 Merge pull request #30 from diceroll123/patch-4
Tweak layer algorithm for bio overrides
2022-12-22 18:49:17 -08:00
d80522ef6d Update SSH key for laptop
I did a factory-reset on this laptop, so the old key is gone forever! Here's the new one!
2022-12-22 18:47:11 -08:00
bf2745f9f1 Oops, remove leftover debug code 2022-12-22 18:44:47 -08:00
722f6b55a5 Tweak layer algorithm for bio overrides 2022-12-12 08:50:38 -05:00
14c6e5a8be State of DTI 2022 letter
not linked anywhere yet, just the page itself created!
2022-12-12 01:46:27 -08:00
5f7a17b244 Fix typo in terms page title 2022-12-12 01:26:10 -08:00
7d64709378 Update public data
Just ran `yarn db:export:public-data`, tada!
2022-12-08 13:34:40 -08:00
d37b7fab2f Add installation guide to README
I tried it in my lil Ubuntu WSL box and hey, it worked great! Neat!!

Pretty neat to have just sat down and fixed up the dev environment for other people?? I'll see about what it would take to actually invite people in…
2022-11-13 09:09:36 -08:00
b84c8ba34e Script to set up dev db with public DTI data
Now, someone with production DB access can run `yarn db:export:public-data` to create `public-data-constants.sql` and `public-data-from-modeling.sql`.

Then, someone setting up their dev database can run `yarn db:setup-dev:full` and get all the wearables data imported right into their dev database!

I'm noticing just how poorly I'm keeping up with my own goals for finishing up DTI, and wondering if now is a good time to circle back to some old offers for code contributions I got last year… I also just figure that making this app Possible To Run with a backup of the basic public database is like. a pretty handy thing to have for archival's sake imo

Note that, for this change, we also set up Git LFS (Large File Storage). Github should be automatically compatible with this! It's a way to not write the whole 30MB database dump into the repository history, and instead keep it in a secondary filestore, because Git's core algorithms aren't really built to handle large blobs of data very well. Users setting up their dev environment will therefore also need to have Git LFS installed for this script to work! (Otherwise, they'll see a "pointer" file in `public-data-from-modeling.sql.gz` that contains some metadata about the file state but not the data itself.)
2022-11-13 07:03:44 -08:00
88511d3dc6 Implement archive:upload:delta
Ok great, we can now run the delta archive process!

It'd be nice to get this running on cron on the impress-2020 server, to a temporary folder? I *do* want to be remembering to run something regularly on my personal machine too though, to keep my own copy up-to-date…
2022-11-05 02:15:31 -07:00
12b5a56694 Fix bug in archive:create scripts
That's what I get for not fully testing lmao! But right, paths in shell scripts are relative to the working directory, and if I want to be relative to the script I gotta use dirname!
2022-10-24 16:56:06 -07:00
22784555d9 Use small pagination toolbar in wardrobe
I thought about making this by media query, but tbh I think this looks better at the larger sizes too?
2022-10-14 20:45:28 -07:00
edce95ec3b Oops, also resize pagination toolbar dropdown 2022-10-14 20:44:57 -07:00
3dfc53cda1 Add basic search results to footer
It looks bad, and only shows page 1, but it does stuff at all! :0
2022-10-14 20:40:06 -07:00
a83e3a9e0b Share search query state between panel and footer 2022-10-14 20:29:57 -07:00
c564400223 Oops, don't mess up mobile with search footer! 2022-10-14 20:26:07 -07:00
589e6c35b4 Hide other search bar when search footer is visible 2022-10-14 20:24:26 -07:00
eb18fd54b6 SearchFooter visual tweaks
Finally playing with this, now that we've been doing paginated search results in the main element! Let's see how it goes 😳

I made a thing to make the pagination toolbar smaller (might want to do that on the mobile view too?), and also to put the search suggestions in a popover floating at the top of the search box.
2022-10-14 20:16:06 -07:00
78d8cf3739 Preload the prev/next item search pages
I tried to do this earlier, but the caching problem from the previous commit (where we weren't including `id` for the search result in the GQL query) was causing it to do a like, infinite loop thing, where the preload results would cache-invalidate the current results, and so the 3 queries would just fight for which one's in the cache?

But now that caching is working, this is working too! Makes it all feel a lot snappier :3
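The preload itself is basically just firing the same search query for the adjacent pages so Apollo caches them — roughly like this sketch, with placeholder query/variable names:

```ts
// Sketch: warm the Apollo cache for the previous/next page of results.
// The query document and variable names are placeholders.
import { useEffect } from "react";
import { useApolloClient, type DocumentNode } from "@apollo/client";

export function usePreloadAdjacentPages(
  searchQuery: DocumentNode,
  variables: { query: string; offset: number; limit: number },
) {
  const client = useApolloClient();

  useEffect(() => {
    const { offset, limit } = variables;
    for (const nextOffset of [offset - limit, offset + limit]) {
      if (nextOffset < 0) continue;
      // Fire-and-forget: this only behaves as a preload now that the results
      // cache correctly; before, these queries kept invalidating each other.
      client
        .query({ query: searchQuery, variables: { ...variables, offset: nextOffset } })
        .catch(() => {});
    }
    // NOTE: assumes the caller memoizes `variables`, so this doesn't re-run
    // on every render.
  }, [client, searchQuery, variables]);
}
```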
2022-10-14 19:41:52 -07:00
2ee48250da Fix caching for search result pages
Apollo Client is pretty darn reliant on an `id` field for effective caching, more often than you'd think!

Before this change, navigating back to a page you'd already loaded would cause it to reload. After this change, it no longer does, and serves the page from cache instead!
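The gist: make sure the query asks for `id` at each level you want cached. Something like this, with illustrative field names:

```ts
// Sketch: including `id` lets Apollo normalize these objects and serve
// repeat visits to a page from cache. Field names are illustrative.
import { gql } from "@apollo/client";

export const SEARCH_RESULTS_QUERY = gql`
  query SearchResults($query: String!, $offset: Int!, $limit: Int!) {
    itemSearch(query: $query, offset: $offset, limit: $limit) {
      id # without this, Apollo can't reliably cache the result page
      numTotalItems
      items {
        id # same story for each item in the page
        name
        thumbnailUrl
      }
    }
  }
`;
```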
2022-10-14 19:38:26 -07:00
a9aeab12a3 Scroll to top when item search page number changes
We also didn't need the query one, because we now `key` the `SearchResults` by the query, so the container becomes empty-then-full-again, which resets scroll back to top.
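Roughly, the two pieces fit together like this — a sketch with placeholder names, not the real component:

```tsx
// Sketch: `key` remounts SearchResults when the query changes (which resets
// scroll by itself), and the effect handles page-number changes. Placeholder names.
import { useEffect, useRef } from "react";

declare function SearchResults(props: { query: string; page: number }): JSX.Element;

function SearchResultsContainer({ query, page }: { query: string; page: number }) {
  const scrollContainerRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    // Query changes are covered by the `key` below; this covers page changes.
    scrollContainerRef.current?.scrollTo({ top: 0 });
  }, [page]);

  return (
    <div ref={scrollContainerRef} style={{ overflowY: "auto" }}>
      <SearchResults key={query} query={query} page={page} />
    </div>
  );
}
```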
2022-10-14 19:33:45 -07:00
45b0aba736 Fix comment 2022-10-14 19:27:05 -07:00
6be5dd74d3 Homepage update: item search pagination 2022-10-14 19:21:55 -07:00
4cdfff03d1 Use pagination instead of infinite scrolling
idk this has been a long-time popular request, so I'm just gonna like. throw it all the way out there. and see what people think of it

I'm a bit worried it might change up the mobile experience too much? But like. let's find out!
2022-10-14 19:19:56 -07:00