# Auth route for scrapers

(Find this issue with `git grep YNQAQKJS`)

## Test curl commands

Put your API key into a header file, like this:

```
X-ApiKey: bad_password
```

Call it "scraper-secret.txt" or something else obviously secret.
Don't check it into Git. The key will expire every 30 days and will need
to be rotated manually. (for now)
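
One way to create it from a shell (a sketch - the key is the `bad_password`
placeholder from above, and `umask 077` keeps the file private):

```
umask 077
printf 'X-ApiKey: bad_password\n' > scraper-secret.txt
echo scraper-secret.txt >> .gitignore
```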

Export the scraper API's URL prefix to an environment variable:

`export API=http://127.0.0.1:4000/scraper`

New versions of Curl can load headers from a text file. All the commands
below use this feature to load the API key.

`curl --header @scraper-secret.txt $API/api/test`

Should return "You're valid!"

`curl --header @scraper-secret.txt $API/v1/server_list`

Should return a JSON object listing all the servers.

`curl --header @scraper-secret.txt $API/v1/server/aliens_wildland/api/v1/dir/`

Proxies into the "aliens_wildland" server and retrieves a JSON object listing
the file server root. (The server must be running a new version of ptth_server
which can serve the JSON API)

`curl --header @scraper-secret.txt $API/v1/server/aliens_wildland/api/v1/dir/src/`

Same, but retrieves the listing for "/src".

`curl --header @scraper-secret.txt $API/v1/server/aliens_wildland/files/src/tests.rs`

There is no special API for retrieving files yet - but the existing server
API is proxied through the new scraper API on the relay.

`curl --head --header @scraper-secret.txt $API/v1/server/aliens_wildland/files/src/tests.rs`

PTTH supports HEAD requests. This request will yield a "204 No Content", with
the "content-length" header.

`curl --header @scraper-secret.txt -H "range: bytes=100-199" $API/v1/server/aliens_wildland/files/src/tests.rs`

PTTH supports byte range requests. This request will skip 100 bytes into the
file, and read 100 bytes.

To avoid fence-post errors, most programming languages use half-open ranges,
e.g. `0..3` means "0, 1, 2". However, HTTP byte ranges are closed ranges,
e.g. `0-3` means "0, 1, 2, 3". So in `bytes=100-199`, byte 199 is the last
byte retrieved.

By polling with HEAD and byte range requests, a scraper client can approximate
`tail -f` behavior of a server-side file.
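
A rough sketch of that loop in shell (hedged - it assumes the file only ever
grows, and skips the error handling and backoff a real scraper would need):

```
FILE=$API/v1/server/aliens_wildland/files/src/tests.rs

length () {
	# Ask for the current file size via a HEAD request
	curl -s --head --header @scraper-secret.txt $FILE \
	| tr -d '\r' | awk 'tolower($1) == "content-length:" { print $2 }'
}

offset=$(length)
while sleep 5; do
	len=$(length)
	if [ "$len" -gt "$offset" ]; then
		# HTTP ranges are closed, so the last new byte is len - 1
		curl -s --header @scraper-secret.txt \
			-H "range: bytes=$offset-$((len - 1))" $FILE
		offset=$len
	fi
done
```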

## Problem statement

PTTH has 2 auth routes:

- A fixed API key for servers
- Whatever the end user puts in front of the HTML client

"Whatever" is hard for scrapers to deal with. This barrier to scraping
is blocking these issues:

- EOTPXGR3 Remote `tail -f`
- UPAQ3ZPT Audit logging of the relay itself
- YMFMSV2R Add Prometheus metrics

## Proposal

Add a 3rd auth route meeting these criteria:

- Enabled by a feature flag, disabled by default
- Bootstrapped by the user-friendly HTML frontend
- Suitable for headless automated scrapers

It will probably involve an API key like the servers use. Public-key
crypto is stronger, but involves more work. I think we should plan to
start with something weak, and also plan to deprecate it once something
stronger is ready.

## Proposed impl plan

- (X) Add feature flags to ptth_relay.toml for dev mode and scrapers
- (X) Make sure Docker release CAN build
- (X) Add hash of 1 scraper key to ptth_relay.toml, with 1 week expiration
- (X) Accept scraper key for some testing endpoint
- (X) (POC) Test with curl
- (X) Clean up scraper endpoint
- (X) Add (almost) end-to-end tests for test scraper endpoint
- (X) Thread server endpoints through relay scraper auth
- (don't care) Add tests for other scraper endpoints
- (don't care) Factor v1 API into v1 module
- (X) Add real scraper endpoints
- ( ) Manually create SQLite DB for scraper keys, add 1 hash
- ( ) Impl DB reads
- ( ) Remove scraper key from config file
- ( ) Make sure `cargo test` passes and Docker CAN build
- ( ) (MVP) Test with curl
- ( ) Impl and test DB init / migration
- ( ) Impl DB writes (Add / revoke keys) as CLI commands
- ( ) Implement API (Behind X-Email auth) for that, test with curl
- ( ) Set up mitmproxy or something to add X-Email header in dev env
- ( ) Implement web UI (Behind X-Email)

POC is the proof-of-concept - At this point we will know that in theory the
feature can work.

MVP is the first deployable version - I could put it in prod, manually fudge
the SQLite DB to add a 1-month key, and let people start building scrapers.
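
That manual fudge might look something like this (a sketch only - the DB
filename, hash, nickname, and email are invented, and the schema is the one
proposed below):

```
sqlite3 scraper-keys.sqlite3 "insert into scraper_keys values (
	'<blake3 hash of the key>',
	strftime ('%s'),
	strftime ('%s') + 2592000,
	'mvp-key',
	'admin@example.com'
);"
```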

Details:

Dev mode will allow anonymous users to generate scraper keys. In prod mode
(the default), clients will need to have the X-Email header set or use a
scraper key to do anything.
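
In dev, that header can be faked with plain curl (a hypothetical example -
the key-management path is made up, and in prod a trusted reverse proxy
would set X-Email after SSO):

```
curl -H "X-Email: dev-user@example.com" http://127.0.0.1:4000/scraper/api/keys
```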

Design the DB so that the servers can share it one day.

Design the API so that new types of auth / keys can be added one day, and
the old ones deprecated.

Endpoints needed:

- (X) Query server list
- (X) Query directory in server
- (not needed) GET file with byte range (identical to frontend file API)

These will all be JSON for now since Python, Rust, C++, C#, etc. can handle it.
For compatibility with wget spidering, I _might_ do XML or HTML that's
machine-readable. We'll see.
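
Since it's all JSON, even the dumb-Bash-script use case can at least
pretty-print responses (assuming `jq` is installed):

```
curl --header @scraper-secret.txt $API/v1/server_list | jq .
```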

## DB / UI impl

Sprint 1:

- Look up keys by their hash
  - not_before
  - not_after
  - name
  - X-Email associated with key

Sprint 2:

- UI to generate / revoke keys

## SQL schema

Migration

```
create table scraper_keys (
	hash text primary key, -- Using blake3 for this because it's not a password
	not_before integer not null, -- Seconds since epoch
	not_after integer not null, -- Seconds since epoch
	name text not null, -- Human-friendly nickname
	email text not null -- Email address that created the key
);
```

Look up hash

```
select not_before, not_after, name, email
from scraper_keys
where
	hash = $1 and
	strftime ('%s') >= not_before and
	strftime ('%s') < not_after
;
```

Create key

```
-- Generate entropy in app code
insert into scraper_keys (
	hash,
	not_before,
	not_after,
	name,
	email
) values (
	$1,
	strftime ('%s'),
	strftime ('%s') + 2592000,
	$4,
	$5
);

-- Respond to client with plaintext key and then forget it.
-- If a network blip causes the key to evaporate, the client should revoke it.
```

Revoke key

```
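-- Left blank in the draft. One sketch, assuming revoking means expiring the
-- key immediately while keeping the row for auditing:
update scraper_keys
set not_after = strftime ('%s')
where hash = $1;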
```

## Decision journal

**Who generates the API key? The scraper client, or the PTTH relay server?**

The precedent from big cloud vendors seems to be that the server generates
tokens. This is probably to avoid a situation where clients with vulnerable
crypto code or just bad code generate low-entropy keys. By putting that
responsibility on the server, the server can enforce high-entropy keys.
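
As a sketch of what that looks like (not the actual implementation - `b3sum`
is the blake3 CLI, and the key format here is an assumption):

```
# The server draws the entropy, so a buggy or malicious client can never
# register a weak key.
key=$(head -c 32 /dev/urandom | base64)        # 256 bits of entropy
hash=$(printf '%s' "$key" | b3sum --no-names)  # store only the blake3 hash
# Send $key to the client once; persist only $hash in scraper_keys.
```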

**Should the key rotate? If so, how?**

The key should _at least_ expire. If it expires every 30 or 90 days, then a
human is slightly inconvenienced to service their scraper regularly.

When adding other features, we must consider the use cases:

1. A really dumb Bash script that shells out to curl
2. A Python script
3. A sophisticated desktop app in C#, Rust, or C++
4. Eventually replacing the fixed API keys used in ptth_server

For the Bash script, rotation will probably be difficult, and I'm okay if
our support for that is merely "It'll work for 30 days at a time, then you
need to rotate keys manually."

For the Python script, rotation could be automated, but cryptography is
still probably difficult. I think some AWS services require actual crypto
keys, and not just high-entropy password keys.

For the sophisticated desktop app, cryptography is on the table, but this
is also the least likely use case to ever happen.