_
c40abb0fe6
🚧 wip: I was working on some port-forwarding idea.
...
It was going to be generic over TCP and use 2 HTTP streams, one each way.
The plan's written down somewhere.
2021-01-19 23:25:24 +00:00
_
88e7839841
🚧 wip: outlining idea for Redis-like KV store in ptth_server
2021-01-03 18:09:00 +00:00
_
075fd604ee
📝 docs: add idea for Redis-like volatile KV store in the server
2021-01-03 04:45:14 +00:00
_
b62c1424fa
📝 docs: document wget spidering
2020-12-21 22:16:12 -06:00
_
fa070ea7d0
📝 docs: planning auth route
2020-12-21 14:19:50 +00:00
_
0d155a5b36
📝 docs: update todos
2020-12-20 20:52:37 -06:00
_
c70f44f4e4
📝 docs: update todo
2020-12-18 17:03:24 +00:00
Trisha
3bc8323cc8
📝 docs: update example curl commands
2020-12-16 10:33:03 -06:00
_
9ac44cfeb7
⭐ new: finish MVP for scraper auth.
...
Adding a SQLite DB to properly track the keys is going to take a while. For
now I'll just keep them in the config file and give them 30-day expirations.
2020-12-16 14:46:03 +00:00
_
cda627fa4b
⭐ new: add JSON API in server for dir listings
2020-12-15 05:15:17 +00:00
_
cdc890ad89
📝 docs: update scraper auth todo
2020-12-14 01:08:00 -06:00
_
e865ac56c7
🚨 refactor: fix some clippy / cargo check warnings
2020-12-13 20:05:52 -06:00
_
78bffc74c3
📝 docs: plan remaining tasks on scraper API
2020-12-13 05:04:04 +00:00
_
4c52d88be0
📝 docs: check off todo for scraper API
2020-12-13 04:56:43 +00:00
_
670ce30667
✅ test: add end-to-end test for scraper API
2020-12-13 01:55:47 +00:00
_
6d68a77364
⭐ new (ptth_relay): add test endpoint for scrapers
...
Scrapers can auth using a shared (but hashed) API key.
The hash of the key is specified in ptth_relay.toml, and forces dev mode on.
2020-12-12 17:50:40 +00:00
_
6961fde7dc
📝 docs: update plan
2020-12-12 17:14:10 +00:00
_
0eb1e7e38f
⭐ new: add code for scraper keys to expire and have limited durations
2020-12-12 17:11:22 +00:00
_
cc96af6110
📝 docs: improve plan for scraper keys
2020-12-12 15:10:14 +00:00
_
0c5a37b441
🐳 build (ptth_relay): clean up Docker build process
...
The new method is much nicer and doesn't require the manual make-old-git
step. The top-level command is actually build_and_minimize.bash, which uses
`git archive` to unpack the last Git commit and build with _that_ Dockerfile
and Docker context. This is better for determinism. It's similar to our build
process for that one big project at work.
2020-12-12 05:08:58 +00:00
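The core trick in build_and_minimize.bash, sketched below with illustrative names and a throwaway repo so it is self-contained: `git archive` exports exactly the last commit, so the Docker build context cannot pick up uncommitted or untracked files.

```shell
set -eu

SRC_DIR=$(mktemp -d)   # throwaway repo standing in for the real checkout
BUILD_DIR=$(mktemp -d) # clean Docker build context
trap 'rm -rf "$SRC_DIR" "$BUILD_DIR"' EXIT

git -C "$SRC_DIR" init -q
echo 'FROM scratch' > "$SRC_DIR/Dockerfile"
git -C "$SRC_DIR" add Dockerfile
git -C "$SRC_DIR" -c user.email=ci@example.com -c user.name=ci commit -qm init
echo leftover > "$SRC_DIR/untracked.txt"   # must NOT reach the build

# The actual technique: unpack HEAD into a clean dir and build from there.
git -C "$SRC_DIR" archive --format=tar HEAD | tar -x -C "$BUILD_DIR"
# docker build -t ptth_relay "$BUILD_DIR"  # the real script builds here
ls "$BUILD_DIR"
```

Only `Dockerfile` lands in the build context; `untracked.txt` never does, which is what makes the build deterministic with respect to the working tree.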
_
f6486b2c1a
🔧 config (ptth_relay): add feature flags
...
- dev mode
- scraper auth
These will gate features I'm adding soon.
2020-12-12 01:26:58 +00:00
_
4014290f98
📝 docs (YNQAQKJS): add plan for 3rd auth route
2020-12-11 21:04:59 +00:00