📝 document how scraper keys work
parent caaed8a5e1
commit a454585d9c
@@ -121,6 +121,11 @@ pub enum AuditData {
 		server: crate::config::file::Server,
 	},
 	RelayStart,
+	ScraperGet {
+		key_name: String,
+		server_name: String,
+		uri: String,
+	},
 	WebClientGet {
 		user: Option <String>,
 		server_name: String,
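For context, here is a minimal self-contained sketch of how the enum excerpt reads once the hunk above is applied. Earlier variants (and the real `crate::config::file::Server` field) are omitted so it compiles on its own, and the values constructed in `main` are made-up examples, not output from the relay:

```
// Sketch only: the AuditData excerpt after this commit, reduced to the
// variants visible in the hunk above.
#[derive(Debug)]
#[allow(dead_code)]
enum AuditData {
	RelayStart,
	ScraperGet {
		key_name: String,
		server_name: String,
		uri: String,
	},
	WebClientGet {
		user: Option <String>,
		server_name: String,
	},
}

fn main() {
	// Hypothetical values, just to show what a scraper audit event carries.
	let event = AuditData::ScraperGet {
		key_name: "shudder_mummy".to_string(),
		server_name: "my_server".to_string(),
		uri: "/scraper/v1/server/my_server/files/".to_string(),
	};
	println!("{:?}", event);
}
```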
@@ -0,0 +1,33 @@
# How scraper keys work

Come up with a random passphrase:

`not this, this is a bogus passphrase for documentation`
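Any decent source of randomness will do. As a sketch (the `rand` crate here is my assumption, not something ptth requires), generating an alphanumeric passphrase in Rust:

```
// Sketch: generate a random 40-character alphanumeric passphrase.
// Assumes the `rand` crate (0.8); a password manager or `openssl rand`
// would work just as well.
use rand::{distributions::Alphanumeric, Rng};

fn main() {
	let passphrase: String = rand::thread_rng()
		.sample_iter(&Alphanumeric)
		.take(40) // 40 chars of [A-Za-z0-9] is roughly 238 bits of entropy
		.map(char::from)
		.collect();
	println!("{}", passphrase);
}
```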
Run that through the `hash-api-key` subcommand of any `ptth_relay` instance:

`ptth_relay hash-api-key`

You'll get a hash like this:

`RUWt1hQQuHIRjftOdgeZf0PG/DtAmIaMqot/nwBAZXQ=`
Make sure that hash gets into the relay's config file, `ptth_relay.toml`:

```
[[scraper_keys]]
name = "shudder_mummy"
not_before = "2021-08-27T19:20:25-05:00"
not_after = "2031-08-27T19:20:25-05:00"
hash = "RUWt1hQQuHIRjftOdgeZf0PG/DtAmIaMqot/nwBAZXQ="
```
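In Rust terms, that block deserializes into something like the following sketch. The field names are taken from the example above, but the actual config structs inside `ptth_relay` may be shaped differently, and `serde` + `toml` are my assumptions:

```
// Sketch: deserialize the [[scraper_keys]] block shown above.
// Field names come from the example config; the real ptth_relay
// config structs may differ. Assumes the serde and toml crates.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct ScraperKey {
	name: String,
	not_before: String, // RFC 3339 timestamps kept as strings for simplicity
	not_after: String,
	hash: String,       // base64 hash printed by `ptth_relay hash-api-key`
}

#[derive(Debug, Deserialize)]
struct Config {
	#[serde(default)]
	scraper_keys: Vec<ScraperKey>,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
	let text = std::fs::read_to_string("ptth_relay.toml")?;
	let config: Config = toml::from_str(&text)?;
	println!("{:#?}", config.scraper_keys);
	Ok(())
}
```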
Use curl to try it out:

```
curl \
--header "X-ApiKey: not this, this is a bogus passphrase for documentation" \
http://localhost:4000/scraper/v1/server/$SERVER_NAME/files/
```

(Replace `$SERVER_NAME` with the name of the server you want to reach, and change the URL so it isn't pointing at localhost.)
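The same request from Rust, as a sketch: `reqwest` with its blocking client is an assumption on my part, and the server name is a placeholder, but the URL shape and the `X-ApiKey` header come straight from the curl example above:

```
// Sketch: hit the scraper API the same way the curl example does.
// Assumes the `reqwest` crate with the "blocking" feature enabled.
fn main() -> Result<(), Box<dyn std::error::Error>> {
	let api_key = "not this, this is a bogus passphrase for documentation";
	let server_name = "my_server"; // hypothetical server name
	let url = format!("http://localhost:4000/scraper/v1/server/{}/files/", server_name);

	let response = reqwest::blocking::Client::new()
		.get(url)
		.header("X-ApiKey", api_key)
		.send()?;

	println!("{}", response.status());
	println!("{}", response.text()?);
	Ok(())
}
```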