Compare commits

..

55 Commits

Author SHA1 Message Date
Alphonse Paix
be69a54fd1 queries
All checks were successful
Rust / Test (push) Successful in 5m51s
Rust / Rustfmt (push) Successful in 22s
Rust / Clippy (push) Successful in 1m38s
Rust / Code coverage (push) Successful in 5m6s
2025-10-11 00:06:08 +02:00
Alphonse Paix
90aa4f8185 Templates update
Some checks failed
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
Rust / Test (push) Has been cancelled
2025-10-11 00:02:05 +02:00
Alphonse Paix
5d5f9ec765 Database worker
Worker used to clean up pending subscriptions and old idempotency
records (see the sketch after the commit list)
2025-10-11 00:02:05 +02:00
Alphonse Paix
7affe88d50 Queries
All checks were successful
Rust / Test (push) Successful in 5m43s
Rust / Rustfmt (push) Successful in 21s
Rust / Clippy (push) Successful in 1m38s
Rust / Code coverage (push) Successful in 5m3s
2025-10-09 23:48:10 +02:00
Alphonse Paix
e02139ff44 Record login for users
Some checks failed
Rust / Test (push) Failing after 4m54s
Rust / Rustfmt (push) Successful in 21s
Rust / Clippy (push) Failing after 1m36s
Rust / Code coverage (push) Successful in 5m4s
2025-10-09 21:05:48 +02:00
Alphonse Paix
45f529902d Moved logging for task worker inside task execution logic 2025-10-09 19:27:50 +02:00
Alphonse Paix
ef9f860da2 User comments
All checks were successful
Rust / Test (push) Successful in 6m17s
Rust / Rustfmt (push) Successful in 21s
Rust / Clippy (push) Successful in 1m37s
Rust / Code coverage (push) Successful in 5m5s
2025-10-08 14:23:43 +02:00
Alphonse Paix
8a5605812c Posts editing tests
All checks were successful
Rust / Test (push) Successful in 6m17s
Rust / Rustfmt (push) Successful in 24s
Rust / Clippy (push) Successful in 1m35s
Rust / Code coverage (push) Successful in 5m9s
2025-10-08 00:13:56 +02:00
Alphonse Paix
d27196d7e5 Merge branch 'tests' 2025-10-07 23:27:15 +02:00
Alphonse Paix
9cbcdc533e Update templates 2025-10-07 23:25:43 +02:00
Alphonse Paix
f18899b1a6 Update banner message 2025-10-07 23:10:52 +02:00
Alphonse Paix
3bfac6d012 Profile update tests 2025-10-07 23:07:16 +02:00
Alphonse Paix
0b402c6259 Warning banner 2025-10-07 19:43:31 +02:00
Alphonse Paix
8b5f55db6f Edit posts
All checks were successful
Rust / Test (push) Successful in 5m43s
Rust / Rustfmt (push) Successful in 22s
Rust / Clippy (push) Successful in 1m37s
Rust / Code coverage (push) Successful in 4m49s
Use fixed routes for the user profile edit handlers to make things easier when a user decides to change their username
2025-10-06 22:33:05 +02:00
Alphonse Paix
b252216709 Edit profile and templates update
All checks were successful
Rust / Test (push) Successful in 6m6s
Rust / Rustfmt (push) Successful in 22s
Rust / Clippy (push) Successful in 1m36s
Rust / Code coverage (push) Successful in 4m47s
2025-10-06 19:23:57 +02:00
Alphonse Paix
da590fb7c6 Templates update
All checks were successful
Rust / Test (push) Successful in 6m12s
Rust / Rustfmt (push) Successful in 21s
Rust / Clippy (push) Successful in 1m36s
Rust / Code coverage (push) Successful in 4m55s
2025-10-06 02:51:52 +02:00
Alphonse Paix
04c2d2b7f5 Test for user system and comments 2025-10-06 02:08:26 +02:00
Alphonse Paix
d96a29ee73 Comment posting is idempotent + tests 2025-10-05 15:01:57 +02:00
Alphonse Paix
8f62c2513e Refresh queries
All checks were successful
Rust / Test (push) Successful in 5m17s
Rust / Rustfmt (push) Successful in 23s
Rust / Clippy (push) Successful in 1m39s
Rust / Code coverage (push) Successful in 4m27s
2025-10-03 21:12:44 +02:00
Alphonse Paix
50a7af2b06 Comments management
Some checks failed
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
Rust / Test (push) Has been cancelled
2025-10-03 21:12:17 +02:00
Alphonse Paix
af9cbdcafb Manage posts on dashboard and templates fixes 2025-10-03 19:18:15 +02:00
Alphonse Paix
ce8c602ddb Posts management widget 2025-10-03 18:30:09 +02:00
Alphonse Paix
9296187181 Queries data
All checks were successful
Rust / Test (push) Successful in 4m55s
Rust / Rustfmt (push) Successful in 23s
Rust / Clippy (push) Successful in 1m39s
Rust / Code coverage (push) Successful in 4m26s
2025-10-02 23:08:48 +02:00
Alphonse Paix
42c1dd9fe3 Templates update
Some checks failed
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
Rust / Test (push) Has been cancelled
2025-10-02 23:05:42 +02:00
Alphonse Paix
96e5dd0f35 Manage users on admin panel
Some checks failed
Rust / Test (push) Failing after 4m18s
Rust / Rustfmt (push) Successful in 22s
Rust / Clippy (push) Failing after 1m39s
Rust / Code coverage (push) Successful in 4m25s
2025-10-02 22:13:02 +02:00
Alphonse Paix
91e80b4881 update workflow
All checks were successful
Rust / Test (push) Successful in 4m19s
Rust / Rustfmt (push) Successful in 23s
Rust / Clippy (push) Successful in 1m42s
Rust / Code coverage (push) Successful in 4m58s
2025-10-02 10:59:18 +02:00
Alphonse Paix
9e5d185aaf Support for comments
Some checks failed
Rust / Test (push) Successful in 5m31s
Rust / Rustfmt (push) Successful in 22s
Rust / Clippy (push) Failing after 27s
Rust / Code coverage (push) Successful in 4m28s
2025-10-02 00:50:01 +02:00
Alphonse Paix
2c7282475f Update workflow
Some checks failed
Rust / Test (push) Successful in 4m24s
Rust / Rustfmt (push) Successful in 23s
Rust / Clippy (push) Failing after 29s
Rust / Code coverage (push) Successful in 4m35s
2025-10-01 01:45:12 +02:00
Alphonse Paix
402c560354 User profile and admin privileges 2025-10-01 01:17:59 +02:00
Alphonse Paix
3e81c27ab3 Administrator privileges to get and delete subscribers
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-30 18:28:04 +02:00
Alphonse Paix
b5b00152cd Redis config
All checks were successful
Rust / Test (push) Successful in 3m44s
Rust / Rustfmt (push) Successful in 23s
Rust / Clippy (push) Successful in 1m13s
Rust / Code coverage (push) Successful in 3m47s
2025-09-30 02:24:56 +02:00
Alphonse Paix
22c462fba3 tests for new post notifications and dashboard stats
All checks were successful
Rust / Test (push) Successful in 3m47s
Rust / Rustfmt (push) Successful in 21s
Rust / Clippy (push) Successful in 1m14s
Rust / Code coverage (push) Successful in 3m49s
2025-09-29 18:22:15 +02:00
Alphonse Paix
de44564ba0 Templates and TLS requests
All checks were successful
Rust / Test (push) Successful in 5m34s
Rust / Rustfmt (push) Successful in 22s
Rust / Clippy (push) Successful in 1m13s
Rust / Code coverage (push) Successful in 3m33s
Refactored HTML templates and added TLS back for issuing HTTP requests
2025-09-29 02:39:53 +02:00
Alphonse Paix
3b727269c5 deleted cargo config and fmt
All checks were successful
Rust / Test (push) Successful in 3m40s
Rust / Rustfmt (push) Successful in 25s
Rust / Clippy (push) Successful in 1m18s
Rust / Code coverage (push) Successful in 3m44s
2025-09-28 21:05:23 +02:00
Alphonse Paix
271aa87b9e Update workflow
Some checks failed
Rust / Test (push) Failing after 1m26s
Rust / Rustfmt (push) Failing after 22s
Rust / Clippy (push) Failing after 27s
Rust / Code coverage (push) Failing after 1m47s
2025-09-28 16:57:06 +02:00
Alphonse Paix
34463d92fc update Dockerfile config to fix a build issue when running on the VPS
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-28 15:25:35 +02:00
Alphonse Paix
c58dfaf647 Remove unnecessary axum_server dependency
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-28 14:58:15 +02:00
Alphonse Paix
b629a8e2fb dockerfile and sqlx queries data
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-28 04:29:52 +02:00
Alphonse Paix
2de3f8dcf7 update dashboard logout button to issue a GET request
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-28 03:43:26 +02:00
Alphonse Paix
1117d49746 Update telemetry
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-28 03:37:23 +02:00
Alphonse Paix
ac96b3c249 spans in logs 2025-09-27 19:59:04 +02:00
Alphonse Paix
87c529ecb6 tracing output 2025-09-27 12:55:20 +02:00
Alphonse Paix
f43e143bf6 custom extractors rejection
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-27 02:28:04 +02:00
Alphonse Paix
f9ae3f42a6 More tests, not found page and dashboard fixes
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
When a post has been deleted, the website now shows a 404 page instead of a 500 page.
Also made the dashboard's empty-page message more explicit.
2025-09-26 20:31:30 +02:00
Alphonse Paix
0f6b479af9 Dashboard subscribers widget
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-26 01:54:48 +02:00
Alphonse Paix
4cb1d2b6fd Compute dashboard stats
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
Track the open rate for new post notifications (whether or not the user
clicked the link). No data about the user is collected in the process;
it only uses an ID inserted by the issue delivery worker (see the
sketches after the commit list).
2025-09-24 04:30:27 +02:00
Alphonse Paix
33281132c6 Update test suite to drop the database automatically when a test is successful
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-24 02:55:18 +02:00
Alphonse Paix
9ea539e5cc cargo sqlx prepare
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-24 00:07:17 +02:00
Alphonse Paix
165fc1bd70 small frontend changes
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-24 00:06:35 +02:00
Alphonse Paix
3153b99d94 Loaders
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-23 23:47:30 +02:00
Alphonse Paix
b1e315921e Load more button on Posts page
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-23 18:24:12 +02:00
Alphonse Paix
5c5e3b0e4c post card fragment 2025-09-23 16:09:25 +02:00
Alphonse Paix
03ca17fdb5 favicon 2025-09-23 16:09:15 +02:00
Alphonse Paix
b00129bca4 Reduce width of post view page and remove placeholders in dashboard
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-23 02:45:44 +02:00
Alphonse Paix
bcb5ada8ef Frontend fixes
Some checks failed
Rust / Test (push) Has been cancelled
Rust / Rustfmt (push) Has been cancelled
Rust / Clippy (push) Has been cancelled
Rust / Code coverage (push) Has been cancelled
2025-09-23 02:40:34 +02:00
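
The "Database worker" commit (5d5f9ec765) pairs with two new cached queries further down in this compare: one deletes subscriptions still pending confirmation after 24 hours, the other expires idempotency records older than an hour. Below is a minimal sketch of what such a worker could look like, assuming a tokio interval loop over a shared PgPool; the function name, tick period, and error handling are illustrative and not taken from the repository.

```rust
use std::time::Duration;

use sqlx::PgPool;

// Sketch only: the two DELETE statements match the cached sqlx queries in
// this compare; everything else (name, interval, logging) is assumed.
pub async fn run_cleanup_worker(pool: PgPool) {
    let mut ticker = tokio::time::interval(Duration::from_secs(10 * 60));
    loop {
        ticker.tick().await;
        // Drop subscriptions that were never confirmed within 24 hours.
        if let Err(e) = sqlx::query(
            "DELETE FROM subscriptions \
             WHERE status = 'pending_confirmation' \
             AND subscribed_at < NOW() - INTERVAL '24 hours'",
        )
        .execute(&pool)
        .await
        {
            tracing::error!(error = %e, "failed to clean up pending subscriptions");
        }
        // Expire idempotency records older than one hour.
        if let Err(e) =
            sqlx::query("DELETE FROM idempotency WHERE created_at < NOW() - INTERVAL '1 hour'")
                .execute(&pool)
                .await
        {
            tracing::error!(error = %e, "failed to clean up old idempotency records");
        }
    }
}
```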
154 changed files with 6465 additions and 1559 deletions

View File

@@ -1,3 +1,3 @@
[target.x86_64-unknown-linux-gnu]
linker = "clang"
rustflags = ["-C", "link-arg=-fuse-ld=/usr/bin/mold"]
rustflags = ["-C", "link-arg=-fuse-ld=mold"]

View File

@@ -4,3 +4,9 @@
Dockerfile
/scripts
/migrations
/node_modules
/assets/css/main.css
/.github
README.md
/tests
/configuration/local.yaml

View File

@@ -1,11 +1,6 @@
# The name of your workflow. GitHub displays the names of your workflows on your repository's "Actions" tab
name: Rust
# To automatically trigger the workflow
on:
# NB: this differs from the book's project!
# These settings allow us to run this specific CI pipeline for PRs against
# this specific branch (a.k.a. book chapter).
push:
branches:
- main
@@ -18,74 +13,51 @@ env:
CARGO_TERM_COLOR: always
SQLX_VERSION: 0.8.6
SQLX_FEATURES: "rustls,postgres"
APP_USER: app
APP_USER_PWD: secret
APP_DB_NAME: newsletter
DATABASE_URL: postgres://postgres:password@postgres:5432/newsletter
APP_DATABASE__HOST: postgres
APP_KV_STORE__HOST: redis
# A workflow run is made up of one or more jobs, which run in parallel by default
# Each job runs in a runner environment specified by runs-on
jobs:
# Unique identifier of our job (`job_id`)
test:
# Sets the name `Test` for the job, which is displayed in the GitHub UI
name: Test
# Containers must run in Linux based operating systems
runs-on: ubuntu-latest
# Service containers to run alongside the `test` container job
services:
# Label used to access the service container
postgres:
# Docker Hub image
image: postgres
# Environment variables scoped only for the `postgres` element
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: password
POSTGRES_DB: postgres
# When you map ports using the ports keyword, GitHub uses the --publish command to publish the containers ports to the Docker host
# Opens tcp port 5432 on the host and service container
POSTGRES_DB: newsletter
ports:
- 5432:5432
- 15432:5432
redis:
image: redis:7
image: redis
ports:
- 6379:6379
- 16379:6379
steps:
# Downloads a copy of the code in your repository before running CI tests
- name: Check out repository code
# The uses keyword specifies that this step will run v4 of the actions/checkout action.
# This is an action that checks out your repository onto the runner, allowing you to run scripts or other actions against your code (such as build and test tools).
# You should use the checkout action any time your workflow will run against the repository's code.
uses: actions/checkout@v4
# This GitHub Action installs a Rust toolchain using rustup. It is designed for one-line concise usage and good defaults.
# It also takes care of caching intermediate build artifacts.
- name: Install mold linker
run: |
sudo apt-get update
sudo apt-get install -y mold clang
- name: Install the Rust toolchain
uses: actions-rust-lang/setup-rust-toolchain@v1
with:
cache: false
- name: Install sqlx-cli
run: cargo install sqlx-cli
--version=${{ env.SQLX_VERSION }}
--features ${{ env.SQLX_FEATURES }}
--no-default-features
--locked
- name: Create app user in Postgres
run: |
sudo apt-get install postgresql-client
# Create the application user
CREATE_QUERY="CREATE USER ${APP_USER} WITH PASSWORD '${APP_USER_PWD}';"
PGPASSWORD="password" psql -U "postgres" -h "localhost" -c "${CREATE_QUERY}"
# Grant create db privileges to the app user
GRANT_QUERY="ALTER USER ${APP_USER} CREATEDB;"
PGPASSWORD="password" psql -U "postgres" -h "localhost" -c "${GRANT_QUERY}"
- name: Migrate database
run: SKIP_DOCKER=true ./scripts/init_db.sh
run: cargo sqlx migrate run
- name: Run tests
run: cargo test
run: TEST_LOG=true cargo test
- name: Check that queries are fresh
run: cargo sqlx prepare --check --workspace
# `fmt` container job
fmt:
name: Rustfmt
runs-on: ubuntu-latest
@@ -95,31 +67,30 @@ jobs:
uses: actions-rust-lang/setup-rust-toolchain@v1
with:
components: rustfmt
cache: false
- name: Enforce formatting
run: cargo fmt --check
# `clippy` container job
clippy:
name: Clippy
runs-on: ubuntu-latest
env:
# This environment variable forces sqlx to use its offline mode,
# which means that it will not attempt to connect to a database
# when running the tests. It'll instead use the cached query results.
# We check that the cached query results are up-to-date in another job,
# to speed up the overall CI pipeline.
# This will all be covered in detail in chapter 5.
SQLX_OFFLINE: true
steps:
- uses: actions/checkout@v4
- name: Check out repository code
uses: actions/checkout@v4
- name: Install mold linker
run: |
sudo apt-get update
sudo apt-get install -y mold clang
- name: Install the Rust toolchain
uses: actions-rust-lang/setup-rust-toolchain@v1
with:
components: clippy
cache: false
- name: Linting
run: cargo clippy -- -D warnings
# `coverage` container job
coverage:
name: Code coverage
runs-on: ubuntu-latest
@@ -129,45 +100,35 @@ jobs:
env:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: password
POSTGRES_DB: postgres
POSTGRES_DB: newsletter
ports:
- 5432:5432
- 15432:5432
redis:
image: redis:7
image: redis
ports:
- 6379:6379
- 16379:6379
steps:
- uses: actions/checkout@v4
- name: Install mold linker
run: |
sudo apt-get update
sudo apt-get install -y mold clang
- name: Install the Rust toolchain
uses: actions-rust-lang/setup-rust-toolchain@v1
with:
components: llvm-tools-preview
cache: false
- name: Install sqlx-cli
run: cargo install sqlx-cli
--version=${{ env.SQLX_VERSION }}
--features ${{ env.SQLX_FEATURES }}
--no-default-features
--locked
- name: Create app user in Postgres
run: |
sudo apt-get install postgresql-client
# Create the application user
CREATE_QUERY="CREATE USER ${APP_USER} WITH PASSWORD '${APP_USER_PWD}';"
PGPASSWORD="password" psql -U "postgres" -h "localhost" -c "${CREATE_QUERY}"
# Grant create db privileges to the app user
GRANT_QUERY="ALTER USER ${APP_USER} CREATEDB;"
PGPASSWORD="password" psql -U "postgres" -h "localhost" -c "${GRANT_QUERY}"
- name: Migrate database
run: SKIP_DOCKER=true ./scripts/init_db.sh
run: cargo sqlx migrate run
- name: Install cargo-llvm-cov
uses: taiki-e/install-action@cargo-llvm-cov
- name: Generate code coverage
run: cargo llvm-cov --all-features --workspace --lcov --output-path lcov.info
- name: Generate report
run: cargo llvm-cov report --html --output-dir coverage
- uses: actions/upload-artifact@v4
with:
name: "Coverage report"
path: coverage/

.gitignore (vendored, 3 changed lines)
View File

@@ -1,3 +1,6 @@
/target
/node_modules
.env
/.idea
docker-compose.yml

View File

@@ -0,0 +1,18 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO comments (user_id, comment_id, post_id, author, content)\n VALUES ($1, $2, $3, $4, $5)\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid",
"Uuid",
"Uuid",
"Text",
"Text"
]
},
"nullable": []
},
"hash": "02fff619c0ff8cb4f9946991be0ce795385b9e6697dcaa52f915acdbb1460e65"
}

View File

@@ -0,0 +1,64 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT p.post_id, p.author_id, u.username AS author, u.full_name,\n p.title, p.content, p.published_at, last_modified\n FROM posts p\n LEFT JOIN users u ON p.author_id = u.user_id\n WHERE p.post_id = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Uuid"
},
{
"ordinal": 2,
"name": "author",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "full_name",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "content",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "published_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "last_modified",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false,
false,
false,
true,
false,
false,
false,
true
]
},
"hash": "059162eba48cf5f519d0d8b6ce63575ced91941b8c55c986b8c5591c7d9b09e4"
}

View File

@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM notifications_delivered WHERE opened = TRUE",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
null
]
},
"hash": "06f07a7522f3ee8e2cdfe5a7988a46f9a2598aa9c0618d00f6287978d5ce28ca"
}

View File

@@ -0,0 +1,12 @@
{
"db_name": "PostgreSQL",
"query": "\n DELETE FROM idempotency\n WHERE created_at < NOW() - INTERVAL '1 hour'\n ",
"describe": {
"columns": [],
"parameters": {
"Left": []
},
"nullable": []
},
"hash": "1e1a90042e89bd8662df3bae15bc7506146cff102034664c77ab0fc68b9480f5"
}

View File

@@ -0,0 +1,64 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT p.author_id, u.username as author, u.full_name,\n p.post_id, p.title, p.content, p.published_at, p.last_modified\n FROM posts p\n INNER JOIN users u ON p.author_id = u.user_id\n WHERE p.author_id = $1\n ORDER BY p.published_at DESC\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "author_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "author",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "full_name",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "post_id",
"type_info": "Uuid"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "content",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "published_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "last_modified",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false,
false,
true,
false,
false,
false,
false,
true
]
},
"hash": "1fc92c14786c21d24951341e3a8149964533b7627d2d073eeac7b7d3230513ce"
}

View File

@@ -0,0 +1,44 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT user_id, password_hash, role as \"role: Role\"\n FROM users\n WHERE username = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "password_hash",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "role: Role",
"type_info": {
"Custom": {
"name": "user_role",
"kind": {
"Enum": [
"admin",
"writer"
]
}
}
}
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
false,
false,
false
]
},
"hash": "22c9449522dcf495d9f49c16ca433aa07a0d1daae4884789ba1e36a918e7dfd1"
}

View File

@@ -0,0 +1,14 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO idempotency (idempotency_key, created_at)\n VALUES ($1, now())\n ON CONFLICT DO NOTHING\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text"
]
},
"nullable": []
},
"hash": "3124db53d9e1fe0701a2fc70eea98e001fef4b75c24d33d8dd595f6b483e8f65"
}

View File

@@ -1,6 +1,6 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT newsletter_issue_id, subscriber_email, unsubscribe_token\n FROM issue_delivery_queue\n FOR UPDATE\n SKIP LOCKED\n LIMIT 1\n ",
"query": "\n SELECT newsletter_issue_id, subscriber_email, unsubscribe_token, kind\n FROM issue_delivery_queue\n FOR UPDATE\n SKIP LOCKED\n LIMIT 1\n ",
"describe": {
"columns": [
{
@@ -17,16 +17,22 @@
"ordinal": 2,
"name": "unsubscribe_token",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "kind",
"type_info": "Text"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
false,
false,
false
]
},
"hash": "6d21a0dd6ef2ea03ce82248ceceab76bb486237ff8e4a2ccd4dbf2b73c496048"
"hash": "3b79eca713fe7e167578537399436f5cb1171a7e89c398e005ad41ee12aaf91f"
}

View File

@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM posts",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
null
]
},
"hash": "3d7376ca79ffd159830fc6d43042d5fe761b6d330924bde7c5fc0f17f533def9"
}

View File

@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM notifications_delivered",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
null
]
},
"hash": "3f4aceeab03c1c7352d6bed39d397e17d1fc934015d53754f9b0055c4701ee21"
}

View File

@@ -1,15 +0,0 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO idempotency (user_id, idempotency_key, created_at)\n VALUES ($1, $2, now())\n ON CONFLICT DO NOTHING\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid",
"Text"
]
},
"nullable": []
},
"hash": "409cb2c83e34fba77b76f031cb0846a8f2716d775c3748887fb0c50f0e0a565b"
}

View File

@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "DELETE FROM subscriptions WHERE id = $1 RETURNING email",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "email",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false
]
},
"hash": "4141df8c45db179016d8e87b023b572bec7e04a6f3324aa17de7e7a9b1fb32ef"
}

View File

@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO issue_delivery_queue (\n newsletter_issue_id,\n subscriber_email,\n unsubscribe_token,\n kind\n )\n SELECT $1, email, unsubscribe_token, $2\n FROM subscriptions\n WHERE status = 'confirmed'\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid",
"Text"
]
},
"nullable": []
},
"hash": "5d9039a01feaca50218a1c791439b2bd3817582798027c00d59d43089531ecc0"
}

View File

@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT user_id, username, role as \"role: Role\", full_name, bio, member_since\n FROM users\n WHERE user_id = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "username",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "role: Role",
"type_info": {
"Custom": {
"name": "user_role",
"kind": {
"Enum": [
"admin",
"writer"
]
}
}
}
},
{
"ordinal": 3,
"name": "full_name",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "bio",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "member_since",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false,
false,
false,
true,
true,
false
]
},
"hash": "601884180bc841dc0762008a819218620fc05169fe3bb80b7635fbe9e227056b"
}

View File

@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM subscriptions",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
null
]
},
"hash": "68a00cae18e40dc76ffea61dfc0ea84d8cb09502b24c11dbb8d403419899dfd1"
}

View File

@@ -0,0 +1,60 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT user_id, username, role as \"role: Role\", full_name, bio, member_since\n FROM users\n ORDER BY member_since DESC\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "username",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "role: Role",
"type_info": {
"Custom": {
"name": "user_role",
"kind": {
"Enum": [
"admin",
"writer"
]
}
}
}
},
{
"ordinal": 3,
"name": "full_name",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "bio",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "member_since",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
false,
false,
true,
true,
false
]
},
"hash": "73dbf3fb780272b1849cd8aa2ecfb59774b1c46bf52181b6298eebccbc86e438"
}

View File

@@ -1,6 +1,6 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT\n response_status_code as \"response_status_code!\",\n response_headers as \"response_headers!: Vec<HeaderPairRecord>\",\n response_body as \"response_body!\"\n FROM idempotency\n WHERE\n user_id = $1\n AND idempotency_key = $2\n ",
"query": "\n SELECT\n response_status_code as \"response_status_code!\",\n response_headers as \"response_headers!: Vec<HeaderPairRecord>\",\n response_body as \"response_body!\"\n FROM idempotency\n WHERE idempotency_key = $1\n ",
"describe": {
"columns": [
{
@@ -44,7 +44,6 @@
],
"parameters": {
"Left": [
"Uuid",
"Text"
]
},
@@ -54,5 +53,5 @@
true
]
},
"hash": "1fc498c8ccbf46f3e00b915e3b3973eb8d44a83a7df6dd7744dc56a2e94a0aa5"
"hash": "74d92b078198c3f73edc272c788249b14b62c59365d745d6a2e314cd9c5db1e9"
}

View File

@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT login_time FROM user_logins\n WHERE user_id = $1\n ORDER BY login_time DESC\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "login_time",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false
]
},
"hash": "769e8762bd2173c088d85fc132326b05a08e67092eac4c3a7aff8a49d086b5a0"
}

View File

@@ -0,0 +1,12 @@
{
"db_name": "PostgreSQL",
"query": "\n DELETE FROM subscriptions\n WHERE status = 'pending_confirmation'\n AND subscribed_at < NOW() - INTERVAL '24 hours'\n ",
"describe": {
"columns": [],
"parameters": {
"Left": []
},
"nullable": []
},
"hash": "7eccf0027753bc1c42897aef12c9350eca023f3be52e24530127d06c3c449104"
}

View File

@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "SELECT author_id FROM posts WHERE post_id = $1",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "author_id",
"type_info": "Uuid"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false
]
},
"hash": "84fcada696e1be5db55ef276e120ffef9adf7f5a4f5c4d5975b85e008e15620b"
}

View File

@@ -0,0 +1,27 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO users (user_id, username, password_hash, role)\n VALUES ($1, $2, $3, $4)\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid",
"Text",
"Text",
{
"Custom": {
"name": "user_role",
"kind": {
"Enum": [
"admin",
"writer"
]
}
}
}
]
},
"nullable": []
},
"hash": "878036fa48e738387e4140d5dc7eccba477794a267f2952aab684028b7c6e286"
}

View File

@@ -0,0 +1,59 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT c.user_id as \"user_id?\", u.username as \"username?\", c.comment_id, c.post_id, c.author, c.content, c.published_at\n FROM comments c\n LEFT JOIN users u ON c.user_id = u.user_id AND c.user_id IS NOT NULL\n ORDER BY published_at DESC\n LIMIT $1\n OFFSET $2\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id?",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "username?",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "comment_id",
"type_info": "Uuid"
},
{
"ordinal": 3,
"name": "post_id",
"type_info": "Uuid"
},
{
"ordinal": 4,
"name": "author",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "content",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "published_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Int8",
"Int8"
]
},
"nullable": [
true,
false,
false,
false,
true,
false,
false
]
},
"hash": "886de678764ebf7f96fe683d3b685d176f0a41043c7ade8b659a9bd167a2d063"
}

View File

@@ -0,0 +1,17 @@
{
"db_name": "PostgreSQL",
"query": "\n UPDATE users\n SET username = $1, full_name = $2, bio = $3\n WHERE user_id = $4\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text",
"Text",
"Text",
"Uuid"
]
},
"nullable": []
},
"hash": "8dc27ae224c7ae3c99c396302357514d66e843dc4b3ee4ab58c628b6c9797fdd"
}

View File

@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM subscriptions WHERE status = 'confirmed'",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
null
]
},
"hash": "95a6533f617e7bae589b00548c73425b2991237b8c823dd7c863e6dad002d4b6"
}

View File

@@ -0,0 +1,14 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE notifications_delivered SET opened = TRUE WHERE email_id = $1",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": []
},
"hash": "9fc831553927814e21dd2aa4ff92d06c32e318c7536918d5adbaf5eaf5777e3d"
}

View File

@@ -1,36 +1,37 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT p.post_id, u.username AS author, p.title, p.content, p.published_at\n FROM posts p\n LEFT JOIN users u ON p.author_id = u.user_id\n ORDER BY p.published_at DESC\n LIMIT $1\n ",
"query": "SELECT * FROM subscriptions ORDER BY subscribed_at DESC LIMIT $1 OFFSET $2",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"name": "id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "author",
"name": "email",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "title",
"type_info": "Text"
"name": "subscribed_at",
"type_info": "Timestamptz"
},
{
"ordinal": 3,
"name": "content",
"name": "status",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "published_at",
"type_info": "Timestamptz"
"name": "unsubscribe_token",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Int8",
"Int8"
]
},
@@ -39,8 +40,8 @@
false,
false,
false,
false
true
]
},
"hash": "9ba5df2593c5dc21de727c16f03a76e4922b940c0877132cd5f622c725b9b123"
"hash": "a6cb227efa5ac12189e662d68b8dcc39032f308f211f603dfcf539b7b071b8e3"
}

View File

@@ -1,28 +0,0 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT user_id, password_hash\n FROM users\n WHERE username = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "password_hash",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
false,
false
]
},
"hash": "acf1b96c82ddf18db02e71a0e297c822b46f10add52c54649cf599b883165e58"
}

View File

@@ -0,0 +1,17 @@
{
"db_name": "PostgreSQL",
"query": "\n UPDATE posts\n SET title = $1, content = $2, last_modified = $3 WHERE post_id = $4\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text",
"Text",
"Timestamptz",
"Uuid"
]
},
"nullable": []
},
"hash": "aef1e780d14be61aa66ae8771309751741068694b291499ee1371de693c6a654"
}

View File

@@ -0,0 +1,14 @@
{
"db_name": "PostgreSQL",
"query": "DELETE FROM posts WHERE post_id = $1",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": []
},
"hash": "b47161386b21432693aa3827963e8167c942e395687cd5ffecb7c064ca2dde70"
}

View File

@@ -1,11 +1,10 @@
{
"db_name": "PostgreSQL",
"query": "\n UPDATE idempotency\n SET\n response_status_code = $3,\n response_headers = $4,\n response_body = $5\n WHERE\n user_id = $1\n AND idempotency_key = $2\n ",
"query": "\n UPDATE idempotency\n SET\n response_status_code = $2,\n response_headers = $3,\n response_body = $4\n WHERE idempotency_key = $1\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid",
"Text",
"Int2",
{
@@ -37,5 +36,5 @@
},
"nullable": []
},
"hash": "32701e61ea14e25608b5f6b05289d08d422e9629d6aee98ac1dcbd50f1edbfe1"
"hash": "b64d5c2e51f328effc8f4687066db96ad695c575fb66195febcdf95c1539a153"
}

View File

@@ -1,46 +0,0 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT p.post_id, u.username AS author, p.title, p.content, p.published_at\n FROM posts p\n LEFT JOIN users u ON p.author_id = u.user_id\n WHERE p.post_id = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "author",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "content",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "published_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false,
false,
false,
false,
false
]
},
"hash": "bccf441e3c1c29ddf6f7f13f7a333adf733abc527da03b12c91422b9b20f3a6f"
}

View File

@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM comments WHERE post_id = $1",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
null
]
},
"hash": "bd08bf95dc1c8c0c7678bc509df7ce776e839846f29981e2e0bdfd382de9370f"
}

View File

@@ -0,0 +1,28 @@
{
"db_name": "PostgreSQL",
"query": "SELECT username, full_name FROM users WHERE user_id = $1",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "username",
"type_info": "Text"
},
{
"ordinal": 1,
"name": "full_name",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": [
false,
true
]
},
"hash": "bfd02c92fb5e0c8748b172bf59a77a477b432ada1f41090571f4fe0e685b1b1b"
}

View File

@@ -1,14 +0,0 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO issue_delivery_queue (\n newsletter_issue_id,\n subscriber_email,\n unsubscribe_token\n )\n SELECT $1, email, unsubscribe_token\n FROM subscriptions\n WHERE status = 'confirmed'\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": []
},
"hash": "ca8fe28bbf395e1c62a495f7299d404043b35f44f639b0edde61ed9e1a7f2944"
}

View File

@@ -0,0 +1,14 @@
{
"db_name": "PostgreSQL",
"query": "DELETE FROM comments WHERE comment_id = $1",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": []
},
"hash": "caf9f2603db6bc8b715cad188501c12f5de5fae49cd04271471f1337a3232f58"
}

View File

@@ -0,0 +1,65 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT p.post_id, p.author_id, u.username AS author, u.full_name,\n p.title, p.content, p.published_at, p.last_modified\n FROM posts p\n LEFT JOIN users u ON p.author_id = u.user_id\n ORDER BY p.published_at DESC\n LIMIT $1\n OFFSET $2\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Uuid"
},
{
"ordinal": 2,
"name": "author",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "full_name",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "content",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "published_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "last_modified",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Int8",
"Int8"
]
},
"nullable": [
false,
false,
false,
true,
false,
false,
false,
true
]
},
"hash": "dc3c1b786b4f4bd65f625922ce05eab4cb161f3de6c6e676af778f7749af5710"
}

View File

@@ -0,0 +1,14 @@
{
"db_name": "PostgreSQL",
"query": "DELETE FROM users WHERE user_id = $1",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": []
},
"hash": "dfa520877c017cd5808d02c24ef2d71938b68093974f335a4d89df91874fdaa2"
}

View File

@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT user_id, username, full_name, role as \"role: Role\", member_since, bio\n FROM users\n WHERE username = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "username",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "full_name",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "role: Role",
"type_info": {
"Custom": {
"name": "user_role",
"kind": {
"Enum": [
"admin",
"writer"
]
}
}
}
},
{
"ordinal": 4,
"name": "member_since",
"type_info": "Timestamptz"
},
{
"ordinal": 5,
"name": "bio",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
false,
false,
true,
false,
false,
true
]
},
"hash": "e049f4db1020c0a2979d5ee3c1c0519de59eee8594eb2e472877e5db6bf25271"
}

View File

@@ -0,0 +1,20 @@
{
"db_name": "PostgreSQL",
"query": "SELECT count(*) FROM comments",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": []
},
"nullable": [
null
]
},
"hash": "e056c3230c1ccd1b3b62e902f49a41f21213e0f7da92b428065986d380676034"
}

View File

@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "SELECT user_id FROM users WHERE username = $1",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id",
"type_info": "Uuid"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
false
]
},
"hash": "f4ea2ad9ba4f26093152e4a0e008ef6c3114fbe9e51301611c5633e1cc944c05"
}

View File

@@ -1,20 +1,25 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT title, text_content, html_content\n FROM newsletter_issues\n WHERE newsletter_issue_id = $1\n ",
"query": "\n SELECT newsletter_issue_id, title, text_content, html_content\n FROM newsletter_issues\n WHERE newsletter_issue_id = $1\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "newsletter_issue_id",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 1,
"ordinal": 2,
"name": "text_content",
"type_info": "Text"
},
{
"ordinal": 2,
"ordinal": 3,
"name": "html_content",
"type_info": "Text"
}
@@ -25,10 +30,11 @@
]
},
"nullable": [
false,
false,
false,
false
]
},
"hash": "43116d4e670155129aa69a7563ddc3f7d01ef3689bb8de9ee1757b401ad95b46"
"hash": "f682b1791fb9871c5f7416711caf32637d6303b2c166ef89e7f725b309d2219f"
}

View File

@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "\n INSERT INTO notifications_delivered (email_id, newsletter_issue_id)\n VALUES ($1, $2)\n ",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid",
"Uuid"
]
},
"nullable": []
},
"hash": "f8afa9b469bf8c216c5855e1d6b7ee05281c9e7779f8fd6486780f882f46e385"
}

View File

@@ -0,0 +1,60 @@
{
"db_name": "PostgreSQL",
"query": "\n SELECT c.user_id as \"user_id?\", u.username as \"username?\", c.comment_id, c.post_id, c.author, c.content, c.published_at\n FROM comments c\n LEFT JOIN users u ON c.user_id = u.user_id AND c.user_id IS NOT NULL\n WHERE c.post_id = $1\n ORDER BY c.published_at DESC\n LIMIT $2\n OFFSET $3\n ",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "user_id?",
"type_info": "Uuid"
},
{
"ordinal": 1,
"name": "username?",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "comment_id",
"type_info": "Uuid"
},
{
"ordinal": 3,
"name": "post_id",
"type_info": "Uuid"
},
{
"ordinal": 4,
"name": "author",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "content",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "published_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Uuid",
"Int8",
"Int8"
]
},
"nullable": [
true,
false,
false,
false,
true,
false,
false
]
},
"hash": "fb280849a8a1fce21ec52cd9df73492d965357c9a410eb3b43b1a2e1cc8a0259"
}

View File

@@ -0,0 +1,14 @@
{
"db_name": "PostgreSQL",
"query": "INSERT INTO user_logins (user_id) VALUES ($1)",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Uuid"
]
},
"nullable": []
},
"hash": "fc383671ada951baa611ab7dd00efcc7f4f2aea7c22e4c0865e5c766ed7f99b3"
}

Cargo.lock (generated, 406 changed lines)
View File

@@ -17,19 +17,6 @@ version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "320119579fcad9c21884f5c4861d16174d0e06250625266f50fe6898340abefa"
[[package]]
name = "ahash"
version = "0.8.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a15f179cd60c4584b8a8c596927aadc462e27f2ca70c04e0071964a73ba7a75"
dependencies = [
"cfg-if",
"getrandom 0.3.3",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "aho-corasick"
version = "1.1.3"
@@ -240,28 +227,6 @@ dependencies = [
"syn",
]
[[package]]
name = "axum-server"
version = "0.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "495c05f60d6df0093e8fb6e74aa5846a0ad06abaf96d76166283720bf740f8ab"
dependencies = [
"arc-swap",
"bytes",
"fs-err",
"http",
"http-body",
"hyper",
"hyper-util",
"pin-project-lite",
"rustls",
"rustls-pemfile",
"rustls-pki-types",
"tokio",
"tokio-rustls",
"tower-service",
]
[[package]]
name = "backtrace"
version = "0.3.75"
@@ -564,6 +529,29 @@ dependencies = [
"typenum",
]
[[package]]
name = "cssparser"
version = "0.35.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4e901edd733a1472f944a45116df3f846f54d37e67e68640ac8bb69689aca2aa"
dependencies = [
"cssparser-macros",
"dtoa-short",
"itoa",
"phf",
"smallvec",
]
[[package]]
name = "cssparser-macros"
version = "0.6.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "13b588ba4ac1a99f7f2964d24b3d896ddc6bf847ee3855dbd4366f058cfcd331"
dependencies = [
"quote",
"syn",
]
[[package]]
name = "darling"
version = "0.20.11"
@@ -638,6 +626,26 @@ dependencies = [
"serde",
]
[[package]]
name = "derive_more"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "093242cf7570c207c83073cf82f79706fe7b8317e98620a47d5be7c3d8497678"
dependencies = [
"derive_more-impl",
]
[[package]]
name = "derive_more-impl"
version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bda628edc44c4bb645fbe0f758797143e4e07926f7ebf4e9bdfbd3d2ce621df3"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "deunicode"
version = "1.6.2"
@@ -691,6 +699,27 @@ version = "0.15.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1aaf95b3e5c8f23aa320147307562d361db0ae0d51242340f558153b4eb2439b"
[[package]]
name = "dtoa"
version = "1.0.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d6add3b8cff394282be81f3fc1a0605db594ed69890078ca6e2cab1c408bcf04"
[[package]]
name = "dtoa-short"
version = "0.3.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd1511a7b6a56299bd043a9c167a6d2bfb37bf84a6dfceaba651168adfb43c87"
dependencies = [
"dtoa",
]
[[package]]
name = "ego-tree"
version = "0.10.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b2972feb8dffe7bc8c5463b1dacda1b0dfbed3710e50f977d965429692d74cd8"
[[package]]
name = "either"
version = "1.15.0"
@@ -847,13 +876,13 @@ dependencies = [
]
[[package]]
name = "fs-err"
version = "3.1.1"
name = "futf"
version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "88d7be93788013f265201256d58f04936a8079ad5dc898743aa20525f503b683"
checksum = "df420e2e84819663797d1ec6544b13c5be84629e7bb00dc960d6917db2987843"
dependencies = [
"autocfg",
"tokio",
"mac",
"new_debug_unreachable",
]
[[package]]
@@ -956,6 +985,15 @@ dependencies = [
"slab",
]
[[package]]
name = "fxhash"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c31b6d751ae2c7f11320402d34e41349dd1016f8d5d45e48c4312bc8625af50c"
dependencies = [
"byteorder",
]
[[package]]
name = "generic-array"
version = "0.14.7"
@@ -967,13 +1005,12 @@ dependencies = [
]
[[package]]
name = "gethostname"
version = "0.2.3"
name = "getopts"
version = "0.2.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c1ebd34e35c46e00bb73e81363248d627782724609fe1b6396f553f68fe3862e"
checksum = "cfe4fbac503b8d1f88e6676011885f34b7174f46e59956bba534ba83abded4df"
dependencies = [
"libc",
"winapi",
"unicode-width",
]
[[package]]
@@ -1099,6 +1136,17 @@ dependencies = [
"windows-sys 0.59.0",
]
[[package]]
name = "html5ever"
version = "0.35.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "55d958c2f74b664487a2035fe1dadb032c48718a03b63f3ab0b8537db8549ed4"
dependencies = [
"log",
"markup5ever",
"match_token",
]
[[package]]
name = "http"
version = "1.3.1"
@@ -1502,6 +1550,12 @@ version = "0.1.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "112b39cec0b298b6c1999fee3e31427f74f676e4cb9879ed1a121b43661a4154"
[[package]]
name = "mac"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c41e0c4fef86961ac6d6f8a82609f55f31b05e4fce149ac5710e439df7619ba4"
[[package]]
name = "markdown"
version = "1.0.0"
@@ -1511,6 +1565,28 @@ dependencies = [
"unicode-id",
]
[[package]]
name = "markup5ever"
version = "0.35.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "311fe69c934650f8f19652b3946075f0fc41ad8757dbb68f1ca14e7900ecc1c3"
dependencies = [
"log",
"tendril",
"web_atoms",
]
[[package]]
name = "match_token"
version = "0.35.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ac84fd3f360fcc43dc5f5d186f02a94192761a080e8bc58621ad4d12296a58cf"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "matchers"
version = "0.1.0"
@@ -1584,6 +1660,12 @@ dependencies = [
"windows-sys 0.59.0",
]
[[package]]
name = "new_debug_unreachable"
version = "1.0.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "650eef8c711430f1a879fdd01d4745a7deea475becfb90269c06775983bbf086"
[[package]]
name = "nom"
version = "7.1.3"
@@ -1818,6 +1900,58 @@ dependencies = [
"sha2",
]
[[package]]
name = "phf"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1fd6780a80ae0c52cc120a26a1a42c1ae51b247a253e4e06113d23d2c2edd078"
dependencies = [
"phf_macros",
"phf_shared",
]
[[package]]
name = "phf_codegen"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "aef8048c789fa5e851558d709946d6d79a8ff88c0440c587967f8e94bfb1216a"
dependencies = [
"phf_generator",
"phf_shared",
]
[[package]]
name = "phf_generator"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3c80231409c20246a13fddb31776fb942c38553c51e871f8cbd687a4cfb5843d"
dependencies = [
"phf_shared",
"rand 0.8.5",
]
[[package]]
name = "phf_macros"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f84ac04429c13a7ff43785d75ad27569f2951ce0ffd30a3321230db2fc727216"
dependencies = [
"phf_generator",
"phf_shared",
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "phf_shared"
version = "0.11.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "67eabc2ef2a60eb7faa00097bd1ffdb5bd28e62bf39990626a582201b7a754e5"
dependencies = [
"siphasher",
]
[[package]]
name = "pin-project-lite"
version = "0.2.16"
@@ -1881,6 +2015,12 @@ dependencies = [
"zerocopy",
]
[[package]]
name = "precomputed-hash"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "925383efa346730478fb4838dbe9137d2a47675ad789c546d150a6e1dd4ab31c"
[[package]]
name = "proc-macro-error-attr2"
version = "2.0.0"
@@ -1952,9 +2092,9 @@ dependencies = [
[[package]]
name = "quinn"
version = "0.11.8"
version = "0.11.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "626214629cda6781b6dc1d316ba307189c85ba657213ce642d9c77670f8202c8"
checksum = "b9e20a958963c291dc322d98411f541009df2ced7b5a4f2bd52337638cfccf20"
dependencies = [
"bytes",
"cfg_aliases",
@@ -1963,7 +2103,7 @@ dependencies = [
"quinn-udp",
"rustc-hash",
"rustls",
"socket2 0.5.10",
"socket2 0.6.0",
"thiserror",
"tokio",
"tracing",
@@ -1972,9 +2112,9 @@ dependencies = [
[[package]]
name = "quinn-proto"
version = "0.11.12"
version = "0.11.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "49df843a9161c85bb8aae55f101bc0bac8bcafd637a620d9122fd7e0b2f7422e"
checksum = "f1906b49b0c3bc04b5fe5d86a77925ae6524a19b816ae38ce1e426255f1d8a31"
dependencies = [
"bytes",
"getrandom 0.3.3",
@@ -1993,14 +2133,14 @@ dependencies = [
[[package]]
name = "quinn-udp"
version = "0.5.13"
version = "0.5.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fcebb1209ee276352ef14ff8732e24cc2b02bbac986cd74a4c81bcb2f9881970"
checksum = "addec6a0dcad8a8d96a771f815f0eaf55f9d1805756410b39f5fa81332574cbd"
dependencies = [
"cfg_aliases",
"libc",
"once_cell",
"socket2 0.5.10",
"socket2 0.6.0",
"tracing",
"windows-sys 0.59.0",
]
@@ -2291,15 +2431,6 @@ dependencies = [
"zeroize",
]
[[package]]
name = "rustls-pemfile"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dce314e5fee3f39953d46bb63bb8a46d40c2f8fb7cc5a3b6cab2bde9721d6e50"
dependencies = [
"rustls-pki-types",
]
[[package]]
name = "rustls-pki-types"
version = "1.12.0"
@@ -2339,6 +2470,21 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "scraper"
version = "0.24.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e5f3a24d916e78954af99281a455168d4a9515d65eca99a18da1b813689c4ad9"
dependencies = [
"cssparser",
"ego-tree",
"getopts",
"html5ever",
"precomputed-hash",
"selectors",
"tendril",
]
[[package]]
name = "secrecy"
version = "0.10.3"
@@ -2349,6 +2495,25 @@ dependencies = [
"zeroize",
]
[[package]]
name = "selectors"
version = "0.31.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5685b6ae43bfcf7d2e7dfcfb5d8e8f61b46442c902531e41a32a9a8bf0ee0fb6"
dependencies = [
"bitflags",
"cssparser",
"derive_more",
"fxhash",
"log",
"new_debug_unreachable",
"phf",
"phf_codegen",
"precomputed-hash",
"servo_arc",
"smallvec",
]
[[package]]
name = "semver"
version = "1.0.26"
@@ -2451,6 +2616,15 @@ dependencies = [
"serde",
]
[[package]]
name = "servo_arc"
version = "0.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "204ea332803bd95a0b60388590d59cf6468ec9becf626e2451f1d26a1d972de4"
dependencies = [
"stable_deref_trait",
]
[[package]]
name = "sha1"
version = "0.10.6"
@@ -2498,6 +2672,12 @@ dependencies = [
"rand_core 0.6.4",
]
[[package]]
name = "siphasher"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "56199f7ddabf13fe5074ce809e7d3f42b42ae711800501b5b16ea82ad029c39d"
[[package]]
name = "slab"
version = "0.4.11"
@@ -2756,6 +2936,31 @@ version = "1.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a8f112729512f8e442d81f95a8a7ddf2b7c6b8a1a6f509a95864142b30cab2d3"
[[package]]
name = "string_cache"
version = "0.8.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bf776ba3fa74f83bf4b63c3dcbbf82173db2632ed8452cb2d891d33f459de70f"
dependencies = [
"new_debug_unreachable",
"parking_lot",
"phf_shared",
"precomputed-hash",
"serde",
]
[[package]]
name = "string_cache_codegen"
version = "0.5.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c711928715f1fe0fe509c53b43e993a9a557babc2d0a3567d0a3006f1ac931a0"
dependencies = [
"phf_generator",
"phf_shared",
"proc-macro2",
"quote",
]
[[package]]
name = "stringprep"
version = "0.1.5"
@@ -2810,6 +3015,17 @@ dependencies = [
"syn",
]
[[package]]
name = "tendril"
version = "0.4.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d24a120c5fc464a3458240ee02c299ebcb9d67b5249c8848b09d639dca8d7bb0"
dependencies = [
"futf",
"mac",
"utf-8",
]
[[package]]
name = "thiserror"
version = "2.0.16"
@@ -2935,9 +3151,9 @@ dependencies = [
[[package]]
name = "tokio-rustls"
version = "0.26.2"
version = "0.26.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e727b36a1a0e8b74c376ac2211e40c2c8af09fb4013c60d910495810f008e9b"
checksum = "05f63835928ca123f1bef57abbcd23bb2ba0ac9ae1235f1e65bda0d06e7786bd"
dependencies = [
"rustls",
"tokio",
@@ -3158,24 +3374,6 @@ dependencies = [
"syn",
]
[[package]]
name = "tracing-bunyan-formatter"
version = "0.3.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2d637245a0d8774bd48df6482e086c59a8b5348a910c3b0579354045a9d82411"
dependencies = [
"ahash",
"gethostname",
"log",
"serde",
"serde_json",
"time",
"tracing",
"tracing-core",
"tracing-log 0.1.4",
"tracing-subscriber",
]
[[package]]
name = "tracing-core"
version = "0.1.34"
@@ -3186,17 +3384,6 @@ dependencies = [
"valuable",
]
[[package]]
name = "tracing-log"
version = "0.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f751112709b4e791d8ce53e32c4ed2d353565a795ce84da2285393f41557bdf2"
dependencies = [
"log",
"once_cell",
"tracing-core",
]
[[package]]
name = "tracing-log"
version = "0.2.0"
@@ -3223,7 +3410,7 @@ dependencies = [
"thread_local",
"tracing",
"tracing-core",
"tracing-log 0.2.0",
"tracing-log",
]
[[package]]
@@ -3301,6 +3488,12 @@ version = "1.12.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f6ccf251212114b54433ec949fd6a7841275f9ada20dddd2f29e9ceea4501493"
[[package]]
name = "unicode-width"
version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4a1a07cc7db3810833284e8d372ccdc6da29741639ecc70c9ec107df0fa6154c"
[[package]]
name = "untrusted"
version = "0.9.0"
@@ -3324,6 +3517,12 @@ version = "2.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "daf8dba3b7eb870caf1ddeed7bc9d2a049f3cfdfae7cb521b087cc33ae4c49da"
[[package]]
name = "utf-8"
version = "0.7.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09cc8ee72d2a9becf2f2febe0205bbed8fc6615b7cb429ad062dc7b7ddd036a9"
[[package]]
name = "utf8_iter"
version = "1.0.4"
@@ -3511,6 +3710,18 @@ dependencies = [
"wasm-bindgen",
]
[[package]]
name = "web_atoms"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "57ffde1dc01240bdf9992e3205668b235e59421fd085e8a317ed98da0178d414"
dependencies = [
"phf",
"phf_codegen",
"string_cache",
"string_cache_codegen",
]
[[package]]
name = "webpki-roots"
version = "0.26.11"
@@ -3859,8 +4070,6 @@ dependencies = [
"argon2",
"askama",
"axum",
"axum-server",
"base64 0.22.1",
"chrono",
"claims",
"config",
@@ -3872,11 +4081,11 @@ dependencies = [
"quickcheck_macros",
"rand 0.9.2",
"reqwest",
"scraper",
"secrecy",
"serde",
"serde-aux",
"serde_json",
"serde_urlencoded",
"sqlx",
"thiserror",
"tokio",
@@ -3884,10 +4093,7 @@ dependencies = [
"tower-sessions",
"tower-sessions-redis-store",
"tracing",
"tracing-bunyan-formatter",
"tracing-subscriber",
"unicode-segmentation",
"urlencoding",
"uuid",
"validator",
"wiremock",

View File

@@ -11,21 +11,31 @@ path = "src/lib.rs"
path = "src/main.rs"
name = "zero2prod"
[profile.release]
opt-level = 'z'
lto = true
codegen-units = 1
panic = 'abort'
strip = true
rpath = false
debug = false
debug-assertions = false
overflow-checks = false
incremental = false
[dependencies]
anyhow = "1.0.99"
argon2 = { version = "0.5.3", features = ["std"] }
askama = "0.14.0"
axum = { version = "0.8.4", features = ["macros"] }
axum-server = { version = "0.7.2", features = ["tls-rustls-no-provider"] }
base64 = "0.22.1"
chrono = { version = "0.4.41", default-features = false, features = ["clock"] }
config = "0.15.14"
markdown = "1.0.0"
rand = { version = "0.9.2", features = ["std_rng"] }
reqwest = { version = "0.12.23", default-features = false, features = [
"rustls-tls",
"json",
"cookies",
"json",
"rustls-tls",
] }
secrecy = { version = "0.10.3", features = ["serde"] }
serde = { version = "1.0.219", features = ["derive"] }
@@ -44,10 +54,7 @@ tower-http = { version = "0.6.6", features = ["fs", "trace"] }
tower-sessions = "0.14.0"
tower-sessions-redis-store = "0.16.0"
tracing = "0.1.41"
tracing-bunyan-formatter = "0.3.10"
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
unicode-segmentation = "1.12.0"
urlencoding = "2.1.3"
uuid = { version = "1.18.0", features = ["v4", "serde"] }
validator = { version = "0.20.0", features = ["derive"] }
@@ -58,6 +65,6 @@ linkify = "0.10.0"
once_cell = "1.21.3"
quickcheck = "1.0.3"
quickcheck_macros = "1.1.0"
scraper = "0.24.0"
serde_json = "1.0.143"
serde_urlencoded = "0.7.1"
wiremock = "0.6.4"

View File

@@ -1,5 +1,6 @@
FROM lukemathwalker/cargo-chef:latest-rust-1.89.0 AS chef
FROM lukemathwalker/cargo-chef:latest-rust-1.90.0 AS chef
WORKDIR /app
RUN apt update && apt install -y nodejs npm clang mold && rm -rf /var/lib/apt/lists/*
FROM chef AS planner
COPY . .
@@ -7,21 +8,17 @@ RUN cargo chef prepare --recipe-path recipe.json
FROM chef AS builder
COPY --from=planner /app/recipe.json recipe.json
RUN apt update -y \
&& apt install -y --no-install-recommends clang mold
RUN cargo chef cook --release --recipe-path recipe.json
COPY . .
ENV SQLX_OFFLINE=true
ENV RUSTFLAGS="-C strip=symbols"
RUN cargo build --release --bin zero2prod
RUN npm install && npm run build-css
FROM debian:bookworm-slim AS runtime
FROM gcr.io/distroless/cc-debian12 AS runtime
WORKDIR /app
RUN apt update -y \
&& apt install -y --no-install-recommends openssl ca-certificates \
&& apt autoremove -y \
&& apt clean -y \
&& rm -rf /var/lib/apt/lists/*
COPY --from=builder /app/target/release/zero2prod zero2prod
COPY configuration configuration
COPY --from=builder /app/assets assets
COPY --from=builder /app/configuration configuration
ENV APP_ENVIRONMENT=production
ENTRYPOINT [ "./zero2prod" ]

File diff suppressed because one or more lines are too long

BIN
assets/favicon.png Normal file (new binary image, 874 B; content not shown)

View File

@@ -2,6 +2,8 @@ application:
port: 8080
host: "127.0.0.1"
base_url: "http://127.0.0.1:8080"
email_client:
authorization_token: "secret-token"
database:
host: "127.0.0.1"
port: 5432
@@ -10,7 +12,6 @@ database:
password: "password"
require_ssl: false
timeout_milliseconds: 1000
email_client:
authorization_token: "secret-token"
redis_uri: "redis://127.0.0.1:6379"
require_tls: false
kv_store:
host: "127.0.0.1"
port: 6379

View File

@@ -0,0 +1 @@
ALTER TABLE issue_delivery_queue ADD COLUMN kind TEXT NOT NULL;

View File

@@ -0,0 +1,7 @@
CREATE TABLE notifications_delivered (
email_id UUID PRIMARY KEY,
newsletter_issue_id UUID NOT NULL
REFERENCES newsletter_issues (newsletter_issue_id),
delivered_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
opened BOOLEAN NOT NULL DEFAULT FALSE
);
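The opened flag above is presumably flipped by some open-tracking endpoint elsewhere in the application; a minimal sketch of the update such an endpoint would run is shown below. The helper name and wiring are assumptions, only the table and column names come from this migration, and sqlx's uuid support is assumed to be enabled.
// Hypothetical helper, not part of this diff: mark a delivered email as opened.
use sqlx::PgPool;
use uuid::Uuid;

async fn mark_email_opened(pool: &PgPool, email_id: Uuid) -> Result<(), sqlx::Error> {
    // email_id is the per-recipient id inserted when the issue is queued for delivery.
    sqlx::query("UPDATE notifications_delivered SET opened = TRUE WHERE email_id = $1")
        .bind(email_id)
        .execute(pool)
        .await?;
    Ok(())
}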

View File

@@ -0,0 +1,7 @@
CREATE TYPE user_role AS ENUM ('admin', 'writer');
ALTER TABLE users ADD COLUMN role user_role;
UPDATE users SET role = 'admin' WHERE role IS NULL;
ALTER TABLE users ALTER COLUMN role SET NOT NULL;

View File

@@ -0,0 +1,4 @@
ALTER TABLE users
ADD COLUMN full_name TEXT,
ADD COLUMN bio TEXT,
ADD COLUMN member_since TIMESTAMPTZ NOT NULL DEFAULT NOW();

View File

@@ -0,0 +1,7 @@
CREATE TABLE comments (
comment_id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
post_id UUID NOT NULL REFERENCES posts (post_id) ON DELETE CASCADE,
author TEXT,
content TEXT NOT NULL,
published_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

View File

@@ -0,0 +1,11 @@
ALTER TABLE idempotency
DROP CONSTRAINT idempotency_user_id_fkey;
ALTER TABLE idempotency
DROP CONSTRAINT idempotency_pkey;
ALTER TABLE idempotency
ADD PRIMARY KEY (idempotency_key);
ALTER TABLE idempotency
DROP COLUMN user_id;

View File

@@ -0,0 +1,3 @@
ALTER TABLE comments
ADD COLUMN user_id UUID
REFERENCES users (user_id) ON DELETE SET NULL;

View File

@@ -0,0 +1,2 @@
ALTER TABLE posts
ADD COLUMN last_modified TIMESTAMPTZ;

View File

@@ -0,0 +1,5 @@
CREATE TABLE user_logins (
id BIGSERIAL PRIMARY KEY,
user_id UUID NOT NULL REFERENCES users (user_id) ON DELETE CASCADE,
login_time TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

3
package-lock.json generated
View File

@@ -1123,7 +1123,8 @@
"version": "4.1.13",
"resolved": "https://registry.npmjs.org/tailwindcss/-/tailwindcss-4.1.13.tgz",
"integrity": "sha512-i+zidfmTqtwquj4hMEwdjshYYgMbOrPzb9a0M3ZgNa0JMoZeFC6bxZvO8yr8ozS6ix2SDz0+mvryPeBs2TFE+w==",
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/tapable": {
"version": "2.2.3",

View File

@@ -1,6 +1,7 @@
{
"scripts": {
"build-css": "tailwindcss -i ./templates/input.css -o ./assets/css/main.css --minify --watch"
"build-css": "tailwindcss -i ./templates/input.css -o ./assets/css/main.css --minify",
"watch-css": "tailwindcss -i ./templates/input.css -o ./assets/css/main.css --watch"
},
"dependencies": {
"@tailwindcss/cli": "^4.1.13",

View File

@@ -1,14 +1,12 @@
use crate::{
routes::AdminError, session_state::TypedSession, telemetry::spawn_blocking_with_tracing,
};
use crate::telemetry::spawn_blocking_with_tracing;
use anyhow::Context;
use argon2::{
Algorithm, Argon2, Params, PasswordHash, PasswordHasher, PasswordVerifier, Version,
password_hash::{SaltString, rand_core::OsRng},
};
use axum::{extract::Request, middleware::Next, response::Response};
use secrecy::{ExposeSecret, SecretString};
use sqlx::PgPool;
use std::fmt::Display;
use uuid::Uuid;
pub struct Credentials {
@@ -44,7 +42,7 @@ pub async fn change_password(
Ok(())
}
fn compute_pasword_hash(password: SecretString) -> Result<SecretString, anyhow::Error> {
pub(crate) fn compute_pasword_hash(password: SecretString) -> Result<SecretString, anyhow::Error> {
let salt = SaltString::generate(&mut OsRng);
let password_hash = Argon2::new(
Algorithm::Argon2id,
@@ -56,15 +54,13 @@ fn compute_pasword_hash(password: SecretString) -> Result<SecretString, anyhow::
Ok(SecretString::from(password_hash))
}
#[tracing::instrument(
name = "Validate credentials",
skip(username, password, connection_pool)
)]
#[tracing::instrument(name = "Validate credentials", skip_all)]
pub async fn validate_credentials(
Credentials { username, password }: Credentials,
connection_pool: &PgPool,
) -> Result<Uuid, AuthError> {
) -> Result<(Uuid, Role), AuthError> {
let mut user_id = None;
let mut role = None;
let mut expected_password_hash = SecretString::from(
"$argon2id$v=19$m=15000,t=2,p=1$\
gZiV/M1gPc22ElAH/Jh1Hw$\
@@ -72,13 +68,14 @@ CWOrkoo7oJBQ/iyh7uJ0LO2aLEfrHwTWllSAxT0zRno"
.to_string(),
);
if let Some((stored_user_id, stored_expected_password_hash)) =
if let Some((stored_user_id, stored_expected_password_hash, stored_role)) =
get_stored_credentials(&username, connection_pool)
.await
.context("Failed to retrieve credentials from database.")
.map_err(AuthError::UnexpectedError)?
{
user_id = Some(stored_user_id);
role = Some(stored_role);
expected_password_hash = stored_expected_password_hash;
}
@@ -89,18 +86,19 @@ CWOrkoo7oJBQ/iyh7uJ0LO2aLEfrHwTWllSAxT0zRno"
.ok_or_else(|| anyhow::anyhow!("Unknown username."))
.map_err(AuthError::InvalidCredentials)?;
let role = role
.ok_or_else(|| anyhow::anyhow!("Unknown role."))
.map_err(AuthError::UnexpectedError)?;
handle
.await
.context("Failed to spawn blocking task.")
.map_err(AuthError::UnexpectedError)?
.map_err(AuthError::InvalidCredentials)
.map(|_| uuid)
.map(|_| (uuid, role))
}
#[tracing::instrument(
name = "Verify password",
skip(expected_password_hash, password_candidate)
)]
#[tracing::instrument(name = "Verify password", skip_all)]
fn verify_password_hash(
expected_password_hash: SecretString,
password_candidate: SecretString,
@@ -115,14 +113,14 @@ fn verify_password_hash(
.context("Password verification failed.")
}
#[tracing::instrument(name = "Get stored credentials", skip(username, connection_pool))]
#[tracing::instrument(name = "Get stored credentials", skip(connection_pool))]
async fn get_stored_credentials(
username: &str,
connection_pool: &PgPool,
) -> Result<Option<(Uuid, SecretString)>, sqlx::Error> {
) -> Result<Option<(Uuid, SecretString, Role)>, sqlx::Error> {
let row = sqlx::query!(
r#"
SELECT user_id, password_hash
SELECT user_id, password_hash, role as "role: Role"
FROM users
WHERE username = $1
"#,
@@ -130,37 +128,35 @@ async fn get_stored_credentials(
)
.fetch_optional(connection_pool)
.await?
.map(|row| (row.user_id, SecretString::from(row.password_hash)));
.map(|row| (row.user_id, SecretString::from(row.password_hash), row.role));
Ok(row)
}
pub async fn require_auth(
session: TypedSession,
mut request: Request,
next: Next,
) -> Result<Response, AdminError> {
let user_id = session
.get_user_id()
.await
.map_err(|e| AdminError::UnexpectedError(e.into()))?
.ok_or(AdminError::NotAuthenticated)?;
let username = session
.get_username()
.await
.map_err(|e| AdminError::UnexpectedError(e.into()))?
.ok_or(AdminError::UnexpectedError(anyhow::anyhow!(
"Could not find username in session."
)))?;
#[derive(serde::Serialize, serde::Deserialize, Debug, Clone, Copy, PartialEq, Eq, sqlx::Type)]
#[sqlx(type_name = "user_role", rename_all = "lowercase")]
pub enum Role {
Admin,
Writer,
}
request
.extensions_mut()
.insert(AuthenticatedUser { user_id, username });
Ok(next.run(request).await)
impl Display for Role {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Role::Admin => write!(f, "admin"),
Role::Writer => write!(f, "writer"),
}
}
}
#[derive(Clone)]
pub struct AuthenticatedUser {
pub user_id: Uuid,
pub username: String,
pub role: Role,
}
impl AuthenticatedUser {
pub fn is_admin(&self) -> bool {
matches!(self.role, Role::Admin)
}
}
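Note that Role now has three textual forms: the lowercase Postgres labels via the sqlx attribute, the lowercase Display impl, and serde's default unit-variant names used when the role is serialized (for example into the session). A small self-contained sketch of the serde round trip; the enum is re-declared without the sqlx derive, and serde_json is used purely for illustration.
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug, Clone, Copy, PartialEq, Eq)]
enum Role {
    Admin,
    Writer,
}

fn main() {
    let json = serde_json::to_string(&Role::Writer).unwrap();
    assert_eq!(json, "\"Writer\""); // serde keeps the variant name as-is
    let back: Role = serde_json::from_str(&json).unwrap();
    assert_eq!(back, Role::Writer);
}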

View File

@@ -1,8 +1,13 @@
use crate::domain::SubscriberEmail;
use anyhow::Context;
use secrecy::{ExposeSecret, SecretString};
use serde::Deserialize;
use serde_aux::field_attributes::deserialize_number_from_string;
use sqlx::postgres::{PgConnectOptions, PgSslMode};
use tower_sessions_redis_store::{
RedisStore,
fred::prelude::{ClientLike, Pool},
};
pub fn get_configuration() -> Result<Settings, config::ConfigError> {
let base_path = std::env::current_dir().expect("Failed to determine the current directory");
@@ -60,8 +65,7 @@ pub struct Settings {
pub application: ApplicationSettings,
pub database: DatabaseSettings,
pub email_client: EmailClientSettings,
pub redis_uri: SecretString,
pub require_tls: bool,
pub kv_store: RedisSettings,
}
#[derive(Clone, Deserialize)]
@@ -101,6 +105,35 @@ impl EmailClientSettings {
}
}
#[derive(Clone, Deserialize)]
pub struct RedisSettings {
pub host: String,
pub port: u16,
}
impl RedisSettings {
pub fn connection_string(&self) -> String {
format!("redis://{}:{}", self.host, self.port)
}
pub async fn session_store(&self) -> Result<RedisStore<Pool>, anyhow::Error> {
let pool = Pool::new(
tower_sessions_redis_store::fred::prelude::Config::from_url(&self.connection_string())
.context("Failed to parse Redis URL string.")?,
None,
None,
None,
6,
)
.unwrap();
pool.connect();
pool.wait_for_connect()
.await
.context("Failed to connect to the Redis server.")?;
Ok(RedisStore::new(pool))
}
}
#[derive(Clone, Deserialize)]
pub struct DatabaseSettings {
pub username: String,

58
src/database_worker.rs Normal file
View File

@@ -0,0 +1,58 @@
use anyhow::Context;
use sqlx::{
PgPool,
postgres::{PgConnectOptions, PgPoolOptions},
};
use std::time::Duration;
pub async fn run_until_stopped(configuration: PgConnectOptions) -> Result<(), anyhow::Error> {
let connection_pool = PgPoolOptions::new().connect_lazy_with(configuration);
worker_loop(connection_pool).await
}
async fn worker_loop(connection_pool: PgPool) -> Result<(), anyhow::Error> {
loop {
if let Err(e) = clean_pending_subscriptions(&connection_pool).await {
tracing::error!("{:?}", e);
}
if let Err(e) = clean_idempotency_keys(&connection_pool).await {
tracing::error!("{:?}", e);
}
tokio::time::sleep(Duration::from_secs(60)).await;
}
}
async fn clean_pending_subscriptions(connection_pool: &PgPool) -> Result<(), anyhow::Error> {
let result = sqlx::query!(
"
DELETE FROM subscriptions
WHERE status = 'pending_confirmation'
AND subscribed_at < NOW() - INTERVAL '24 hours'
"
)
.execute(connection_pool)
.await
.context("Failed to clean up subscriptions table.")?;
match result.rows_affected() {
n if n > 0 => tracing::info!("Cleaned up {} expired subscriptions.", n),
_ => (),
}
Ok(())
}
async fn clean_idempotency_keys(connection_pool: &PgPool) -> Result<(), anyhow::Error> {
let result = sqlx::query!(
"
DELETE FROM idempotency
WHERE created_at < NOW() - INTERVAL '1 hour'
"
)
.execute(connection_pool)
.await
.context("Failed to clean up idempontency table.")?;
match result.rows_affected() {
n if n > 0 => tracing::info!("Cleaned up {} old idempotency records.", n),
_ => (),
}
Ok(())
}

View File

@@ -1,7 +1,13 @@
mod comment;
mod new_subscriber;
mod post;
mod subscriber_email;
mod subscribers;
mod user;
pub use comment::CommentEntry;
pub use new_subscriber::NewSubscriber;
pub use post::PostEntry;
pub use subscriber_email::SubscriberEmail;
pub use subscribers::SubscriberEntry;
pub use user::UserEntry;

18
src/domain/comment.rs Normal file
View File

@@ -0,0 +1,18 @@
use chrono::{DateTime, Utc};
use uuid::Uuid;
pub struct CommentEntry {
pub user_id: Option<Uuid>,
pub username: Option<String>,
pub comment_id: Uuid,
pub post_id: Uuid,
pub author: Option<String>,
pub content: String,
pub published_at: DateTime<Utc>,
}
impl CommentEntry {
pub fn formatted_date(&self) -> String {
self.published_at.format("%B %d, %Y %H:%M").to_string()
}
}
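As a quick sanity check on the "%B %d, %Y %H:%M" pattern used by formatted_date, a self-contained example; the timestamp is illustrative.
use chrono::{TimeZone, Utc};

fn main() {
    let published_at = Utc.with_ymd_and_hms(2025, 10, 11, 0, 6, 0).unwrap();
    // Same format string as CommentEntry::formatted_date above.
    assert_eq!(
        published_at.format("%B %d, %Y %H:%M").to_string(),
        "October 11, 2025 00:06"
    );
}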

View File

@@ -3,25 +3,23 @@ use uuid::Uuid;
pub struct PostEntry {
pub post_id: Uuid,
pub author: Option<String>,
pub author_id: Uuid,
pub author: String,
pub full_name: Option<String>,
pub title: String,
pub content: String,
pub published_at: DateTime<Utc>,
pub last_modified: Option<DateTime<Utc>>,
}
impl PostEntry {
#[allow(dead_code)]
pub fn formatted_date(&self) -> String {
self.published_at.format("%B %d, %Y").to_string()
self.published_at.format("%B %d, %Y %H:%M").to_string()
}
pub fn to_html(self) -> Result<Self, anyhow::Error> {
pub fn to_html(&self) -> anyhow::Result<String> {
match markdown::to_html_with_options(&self.content, &markdown::Options::gfm()) {
Ok(mut content) => {
content = content.replace("<table>", r#"<div class="table-wrapper"><table>"#);
content = content.replace("</table>", r#"</table></div>"#);
Ok(Self { content, ..self })
}
Ok(content) => Ok(content),
Err(e) => anyhow::bail!(e),
}
}

View File

@@ -1,3 +1,5 @@
use std::fmt::Display;
use validator::Validate;
#[derive(Debug, Validate)]
@@ -22,6 +24,12 @@ impl AsRef<str> for SubscriberEmail {
}
}
impl Display for SubscriberEmail {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.email)
}
}
#[cfg(test)]
mod tests {
use super::SubscriberEmail;

21
src/domain/subscribers.rs Normal file
View File

@@ -0,0 +1,21 @@
use chrono::{DateTime, Utc};
use uuid::Uuid;
pub struct SubscriberEntry {
pub id: Uuid,
pub email: String,
pub subscribed_at: DateTime<Utc>,
pub status: String,
pub unsubscribe_token: Option<String>,
}
impl SubscriberEntry {
pub fn confirmed(&self) -> bool {
self.status == "confirmed"
}
#[allow(dead_code)]
pub fn formatted_date(&self) -> String {
self.subscribed_at.format("%B %d, %Y").to_string()
}
}

22
src/domain/user.rs Normal file
View File

@@ -0,0 +1,22 @@
use crate::authentication::Role;
use chrono::{DateTime, Utc};
use uuid::Uuid;
pub struct UserEntry {
pub user_id: Uuid,
pub username: String,
pub role: Role,
pub full_name: Option<String>,
pub bio: Option<String>,
pub member_since: DateTime<Utc>,
}
impl UserEntry {
pub fn formatted_date(&self) -> String {
self.member_since.format("%B %d, %Y").to_string()
}
pub fn is_admin(&self) -> bool {
matches!(self.role, Role::Admin)
}
}

View File

@@ -119,7 +119,7 @@ mod tests {
EmailClient::build(settings).unwrap()
}
#[tokio::test]
#[sqlx::test]
async fn send_email_sends_the_expected_request() {
let mock_server = MockServer::start().await;
let email_client = email_client(mock_server.uri());
@@ -141,7 +141,7 @@ mod tests {
.unwrap();
}
#[tokio::test]
#[sqlx::test]
async fn send_email_succeeds_if_the_server_returns_200() {
let mock_server = MockServer::start().await;
let email_client = email_client(mock_server.uri());
@@ -159,7 +159,7 @@ mod tests {
assert_ok!(response);
}
#[tokio::test]
#[sqlx::test]
async fn send_email_fails_if_the_server_retuns_500() {
let mock_server = MockServer::start().await;
let email_client = email_client(mock_server.uri());
@@ -177,7 +177,7 @@ mod tests {
assert_err!(response);
}
#[tokio::test]
#[sqlx::test]
async fn send_email_times_out_if_the_server_takes_too_long() {
let mock_server = MockServer::start().await;
let email_client = email_client(mock_server.uri());

View File

@@ -1,3 +1,4 @@
#[derive(Debug)]
pub struct IdempotencyKey(String);
impl TryFrom<String> for IdempotencyKey {

View File

@@ -7,7 +7,6 @@ use axum::{
use reqwest::StatusCode;
use sqlx::{Executor, PgPool, Postgres, Transaction};
use std::str::FromStr;
use uuid::Uuid;
#[derive(Debug, sqlx::Type)]
#[sqlx(type_name = "header_pair")]
@@ -16,10 +15,13 @@ struct HeaderPairRecord {
value: Vec<u8>,
}
#[tracing::instrument(
name = "Fetching saved response in database if it exists",
skip(connection_pool)
)]
pub async fn get_saved_response(
connection_pool: &PgPool,
idempotency_key: &IdempotencyKey,
user_id: Uuid,
) -> Result<Option<Response>, anyhow::Error> {
let saved_response = sqlx::query!(
r#"
@@ -28,11 +30,8 @@ pub async fn get_saved_response(
response_headers as "response_headers!: Vec<HeaderPairRecord>",
response_body as "response_body!"
FROM idempotency
WHERE
user_id = $1
AND idempotency_key = $2
WHERE idempotency_key = $1
"#,
user_id,
idempotency_key.as_ref()
)
.fetch_optional(connection_pool)
@@ -53,10 +52,10 @@ pub async fn get_saved_response(
}
}
#[tracing::instrument(name = "Saving response in database", skip(transaction, response))]
pub async fn save_response(
mut transaction: Transaction<'static, Postgres>,
idempotency_key: &IdempotencyKey,
user_id: Uuid,
response: Response<Body>,
) -> Result<Response<Body>, anyhow::Error> {
let status_code = response.status().as_u16() as i16;
@@ -75,14 +74,11 @@ pub async fn save_response(
r#"
UPDATE idempotency
SET
response_status_code = $3,
response_headers = $4,
response_body = $5
WHERE
user_id = $1
AND idempotency_key = $2
response_status_code = $2,
response_headers = $3,
response_body = $4
WHERE idempotency_key = $1
"#,
user_id,
idempotency_key.as_ref(),
status_code,
headers,
@@ -104,23 +100,21 @@ pub enum NextAction {
pub async fn try_processing(
connection_pool: &PgPool,
idempotency_key: &IdempotencyKey,
user_id: Uuid,
) -> Result<NextAction, anyhow::Error> {
let mut transaction = connection_pool.begin().await?;
let query = sqlx::query!(
r#"
INSERT INTO idempotency (user_id, idempotency_key, created_at)
VALUES ($1, $2, now())
INSERT INTO idempotency (idempotency_key, created_at)
VALUES ($1, now())
ON CONFLICT DO NOTHING
"#,
user_id,
idempotency_key.as_ref()
);
let n_inserted_rows = transaction.execute(query).await?.rows_affected();
if n_inserted_rows > 0 {
Ok(NextAction::StartProcessing(transaction))
} else {
let saved_response = get_saved_response(connection_pool, idempotency_key, user_id)
let saved_response = get_saved_response(connection_pool, idempotency_key)
.await?
.ok_or_else(|| anyhow::anyhow!("Could not find saved response."))?;
Ok(NextAction::ReturnSavedResponse(saved_response))

View File

@@ -1,10 +1,13 @@
use crate::{configuration::Settings, domain::SubscriberEmail, email_client::EmailClient};
use crate::{
configuration::Settings, domain::SubscriberEmail, email_client::EmailClient, routes::EmailType,
};
use anyhow::Context;
use sqlx::{Executor, PgPool, Postgres, Row, Transaction, postgres::PgPoolOptions};
use std::time::Duration;
use tracing::{Span, field::display};
use uuid::Uuid;
pub async fn run_worker_until_stopped(configuration: Settings) -> Result<(), anyhow::Error> {
pub async fn run_until_stopped(configuration: Settings) -> Result<(), anyhow::Error> {
let connection_pool = PgPoolOptions::new().connect_lazy_with(configuration.database.with_db());
let email_client = EmailClient::build(configuration.email_client).unwrap();
worker_loop(connection_pool, email_client).await
@@ -28,48 +31,32 @@ pub enum ExecutionOutcome {
EmptyQueue,
}
#[tracing::instrument(
skip_all,
fields(
newsletter_issue_id=tracing::field::Empty,
subscriber_email=tracing::field::Empty
),
err
)]
pub async fn try_execute_task(
connection_pool: &PgPool,
email_client: &EmailClient,
) -> Result<ExecutionOutcome, anyhow::Error> {
let task = dequeue_task(connection_pool).await?;
if task.is_none() {
return Ok(ExecutionOutcome::EmptyQueue);
}
let (transaction, task) = task.unwrap();
let (mut transaction, task) = match task {
Some((transaction, task)) => (transaction, task),
None => return Ok(ExecutionOutcome::EmptyQueue),
};
Span::current()
.record("newsletter_issue_id", display(task.newsletter_issue_id))
.record("subscriber_email", display(&task.subscriber_email));
match SubscriberEmail::parse(task.subscriber_email.clone()) {
Ok(email) => {
let mut issue = get_issue(connection_pool, task.newsletter_issue_id).await?;
issue.inject_unsubscribe_token(&task.unsubscribe_token);
if let Err(e) = email_client
.send_email(
&email,
&issue.title,
&issue.html_content,
&issue.text_content,
)
.await
{
tracing::error!(
error.message = %e,
"Failed to deliver issue to confirmed subscriber. Skipping."
);
}
execute_task(
connection_pool,
&mut transaction,
&task,
email,
email_client,
)
.await?;
}
Err(e) => {
tracing::error!(
error.message = %e,
error = %e,
"Skipping a subscriber. Their stored contact details are invalid."
);
}
@@ -85,6 +72,7 @@ pub async fn try_execute_task(
}
struct NewsletterIssue {
newsletter_issue_id: Uuid,
title: String,
text_content: String,
html_content: String,
@@ -95,9 +83,30 @@ impl NewsletterIssue {
self.text_content = self.text_content.replace("UNSUBSCRIBE_TOKEN", token);
self.html_content = self.html_content.replace("UNSUBSCRIBE_TOKEN", token);
}
async fn inject_tracking_info(
&mut self,
transaction: &mut Transaction<'static, Postgres>,
) -> Result<(), anyhow::Error> {
let email_id = Uuid::new_v4();
let query = sqlx::query!(
r#"
INSERT INTO notifications_delivered (email_id, newsletter_issue_id)
VALUES ($1, $2)
"#,
email_id,
self.newsletter_issue_id
);
transaction
.execute(query)
.await
.context("Failed to store email tracking info.")?;
self.text_content = self.text_content.replace("EMAIL_ID", &email_id.to_string());
self.html_content = self.html_content.replace("EMAIL_ID", &email_id.to_string());
Ok(())
}
}
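Both inject_unsubscribe_token and inject_tracking_info rely on plain string replacement over the rendered email bodies; a minimal self-contained illustration of that substitution step follows (the URL, token, and id values are made up).
fn main() {
    let html = r#"<a href="https://example.com/unsubscribe/UNSUBSCRIBE_TOKEN">Unsubscribe</a> <img src="https://example.com/open/EMAIL_ID">"#;
    // The worker performs the same replace calls on both the HTML and text bodies.
    let rendered = html
        .replace("UNSUBSCRIBE_TOKEN", "tok-123")
        .replace("EMAIL_ID", "id-456");
    assert!(rendered.contains("/unsubscribe/tok-123"));
    assert!(rendered.contains("/open/id-456"));
}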
#[tracing::instrument(skip_all)]
async fn get_issue(
connection_pool: &PgPool,
issue_id: Uuid,
@@ -105,7 +114,7 @@ async fn get_issue(
let issue = sqlx::query_as!(
NewsletterIssue,
r#"
SELECT title, text_content, html_content
SELECT newsletter_issue_id, title, text_content, html_content
FROM newsletter_issues
WHERE newsletter_issue_id = $1
"#,
@@ -120,16 +129,16 @@ pub struct Task {
pub newsletter_issue_id: Uuid,
pub subscriber_email: String,
pub unsubscribe_token: String,
pub kind: String,
}
#[tracing::instrument(skip_all)]
async fn dequeue_task(
connection_pool: &PgPool,
) -> Result<Option<(Transaction<'static, Postgres>, Task)>, anyhow::Error> {
let mut transaction = connection_pool.begin().await?;
let query = sqlx::query!(
r#"
SELECT newsletter_issue_id, subscriber_email, unsubscribe_token
SELECT newsletter_issue_id, subscriber_email, unsubscribe_token, kind
FROM issue_delivery_queue
FOR UPDATE
SKIP LOCKED
@@ -142,6 +151,7 @@ async fn dequeue_task(
newsletter_issue_id: row.get("newsletter_issue_id"),
subscriber_email: row.get("subscriber_email"),
unsubscribe_token: row.get("unsubscribe_token"),
kind: row.get("kind"),
};
Ok(Some((transaction, task)))
} else {
@@ -149,7 +159,35 @@ async fn dequeue_task(
}
}
#[tracing::instrument(skip_all)]
#[tracing::instrument(
name = "Executing task",
skip_all,
fields(email = %email),
)]
async fn execute_task(
connection_pool: &PgPool,
transaction: &mut Transaction<'static, Postgres>,
task: &Task,
email: SubscriberEmail,
email_client: &EmailClient,
) -> Result<(), anyhow::Error> {
let mut issue = get_issue(connection_pool, task.newsletter_issue_id).await?;
issue.inject_unsubscribe_token(&task.unsubscribe_token);
if task.kind == EmailType::NewPost.to_string() {
issue.inject_tracking_info(transaction).await?;
}
email_client
.send_email(
&email,
&issue.title,
&issue.html_content,
&issue.text_content,
)
.await
.context("Failed to deliver newsletter issue to subscriber..")?;
Ok(())
}
async fn delete_task(
mut transaction: Transaction<'static, Postgres>,
issue_id: Uuid,

View File

@@ -1,5 +1,6 @@
pub mod authentication;
pub mod configuration;
pub mod database_worker;
pub mod domain;
pub mod email_client;
pub mod idempotency;

View File

@@ -1,6 +1,6 @@
use zero2prod::{
configuration::get_configuration, issue_delivery_worker::run_worker_until_stopped,
startup::Application, telemetry::init_subscriber,
configuration::get_configuration, database_worker, issue_delivery_worker, startup::Application,
telemetry::init_subscriber,
};
#[tokio::main]
@@ -11,11 +11,16 @@ async fn main() -> Result<(), anyhow::Error> {
let application = Application::build(configuration.clone()).await?;
let application_task = tokio::spawn(application.run_until_stopped());
let worker_task = tokio::spawn(run_worker_until_stopped(configuration));
let database_worker_task = tokio::spawn(database_worker::run_until_stopped(
configuration.database.with_db(),
));
let delivery_worker_task =
tokio::spawn(issue_delivery_worker::run_until_stopped(configuration));
tokio::select! {
_ = application_task => {},
_ = worker_task => {},
_ = database_worker_task => {},
_ = delivery_worker_task => {},
};
Ok(())

View File

@@ -1,4 +1,5 @@
mod admin;
mod comments;
mod health_check;
mod home;
mod login;
@@ -6,26 +7,32 @@ mod posts;
mod subscriptions;
mod subscriptions_confirm;
mod unsubscribe;
mod users;
pub use admin::*;
use askama::Template;
use axum::{
http::HeaderMap,
extract::FromRequestParts,
http::{HeaderMap, request::Parts},
response::{Html, IntoResponse, Response},
};
pub use comments::*;
pub use health_check::*;
pub use home::*;
pub use login::*;
pub use posts::*;
use rand::{Rng, distr::Alphanumeric};
use reqwest::StatusCode;
use serde::de::DeserializeOwned;
pub use subscriptions::*;
pub use subscriptions_confirm::*;
pub use unsubscribe::*;
pub use users::*;
use validator::ValidationErrors;
use crate::{
authentication::AuthError,
templates::{InternalErrorTemplate, MessageTemplate, NotFoundTemplate},
templates::{ErrorTemplate, HtmlTemplate, MessageTemplate},
};
pub fn generate_token() -> String {
@@ -61,6 +68,8 @@ pub enum AppError {
FormError(#[source] anyhow::Error),
#[error("Authentication is required.")]
NotAuthenticated,
#[error("Handler extractor failed.")]
Extractor(#[source] anyhow::Error),
}
impl From<anyhow::Error> for AppError {
@@ -104,19 +113,16 @@ impl IntoResponse for AppError {
full_page,
} => {
let html = if *full_page {
Html(InternalErrorTemplate.render().unwrap())
Html(ErrorTemplate::InternalServer.render().unwrap())
} else {
let template = MessageTemplate::Error {
message: "An internal server error occured.".into(),
};
let template =
MessageTemplate::error("An internal server error occured.".into());
Html(template.render().unwrap())
};
html.into_response()
}
AppError::FormError(error) => {
let template = MessageTemplate::Error {
message: error.to_string(),
};
let template = MessageTemplate::error(error.to_string());
Html(template.render().unwrap()).into_response()
}
AppError::NotAuthenticated => {
@@ -124,6 +130,7 @@ impl IntoResponse for AppError {
headers.insert("HX-Redirect", "/login".parse().unwrap());
(StatusCode::OK, headers).into_response()
}
AppError::Extractor(_) => not_found_html(),
}
}
}
@@ -155,9 +162,68 @@ impl From<AuthError> for AppError {
}
pub async fn not_found() -> Response {
(
StatusCode::NOT_FOUND,
Html(NotFoundTemplate.render().unwrap()),
)
.into_response()
tracing::error!("Not found.");
not_found_html()
}
pub fn not_found_html() -> Response {
let template = HtmlTemplate(ErrorTemplate::NotFound);
(StatusCode::NOT_FOUND, template).into_response()
}
pub struct Path<T>(T);
impl<T, S> FromRequestParts<S> for Path<T>
where
T: DeserializeOwned + Send,
S: Send + Sync,
{
type Rejection = AppError;
async fn from_request_parts(parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection> {
match axum::extract::Path::<T>::from_request_parts(parts, state).await {
Ok(value) => Ok(Self(value.0)),
Err(rejection) => Err(AppError::Extractor(anyhow::anyhow!(
"Path rejection: {:?}",
rejection
))),
}
}
}
pub struct Query<T>(pub T);
impl<T, S> FromRequestParts<S> for Query<T>
where
T: DeserializeOwned,
S: Send + Sync,
{
type Rejection = AppError;
async fn from_request_parts(parts: &mut Parts, state: &S) -> Result<Self, Self::Rejection> {
match axum::extract::Query::<T>::from_request_parts(parts, state).await {
Ok(value) => Ok(Self(value.0)),
Err(rejection) => Err(AppError::Extractor(anyhow::anyhow!(
"Query rejection: {:?}",
rejection
))),
}
}
}
pub fn join_error_messages(e: ValidationErrors) -> String {
let error_messages: Vec<_> = e
.field_errors()
.iter()
.flat_map(|(field, errors)| {
errors.iter().map(move |error| {
error
.message
.as_ref()
.map(|msg| msg.to_string())
.unwrap_or(format!("Invalid field: {}", field))
})
})
.collect();
error_messages.join("\n")
}

View File

@@ -3,18 +3,27 @@ mod dashboard;
mod logout;
mod newsletters;
mod posts;
mod subscribers;
use crate::{
authentication::AuthenticatedUser,
routes::{AppError, error_chain_fmt},
session_state::TypedSession,
templates::{HtmlTemplate, MessageTemplate},
};
use anyhow::Context;
use axum::response::Redirect;
use axum::{
extract::Request,
middleware::Next,
response::{IntoResponse, Response},
};
use axum::{extract::Request, middleware::Next, response::Response};
pub use change_password::*;
pub use dashboard::*;
pub use logout::*;
pub use newsletters::*;
pub use posts::*;
pub use subscribers::*;
#[derive(thiserror::Error)]
pub enum AdminError {
@@ -23,7 +32,7 @@ pub enum AdminError {
#[error("Trying to access admin dashboard without authentication.")]
NotAuthenticated,
#[error("Updating password failed.")]
ChangePassword(String),
ChangePassword(anyhow::Error),
#[error("Could not publish newsletter.")]
Publish(#[source] anyhow::Error),
#[error("The idempotency key was invalid.")]
@@ -41,11 +50,17 @@ pub async fn require_auth(
mut request: Request,
next: Next,
) -> Result<Response, AppError> {
let user_id = session
let user_id = match session
.get_user_id()
.await
.map_err(|e| AdminError::UnexpectedError(e.into()))?
.ok_or(AdminError::NotAuthenticated)?;
{
None => {
tracing::error!("Not authenticated. Redirecting to /login.");
return Ok(Redirect::to("/login").into_response());
}
Some(user_id) => user_id,
};
let username = session
.get_username()
.await
@@ -53,10 +68,36 @@ pub async fn require_auth(
.ok_or(AdminError::UnexpectedError(anyhow::anyhow!(
"Could not find username in session."
)))?;
let role = session
.get_role()
.await
.context("Error retrieving user role in session.")?
.ok_or(anyhow::anyhow!("Could not find user role in session."))?;
request
.extensions_mut()
.insert(AuthenticatedUser { user_id, username });
request.extensions_mut().insert(AuthenticatedUser {
user_id,
username,
role,
});
Ok(next.run(request).await)
}
pub async fn require_admin(
session: TypedSession,
request: Request,
next: Next,
) -> Result<Response, AppError> {
if session
.has_admin_permissions()
.await
.context("Error retrieving user role in session.")?
{
Ok(next.run(request).await)
} else {
Ok(HtmlTemplate(MessageTemplate::error(
"This action requires administrator privileges.".into(),
))
.into_response())
}
}

View File

@@ -20,7 +20,9 @@ pub struct PasswordFormData {
}
pub async fn change_password(
Extension(AuthenticatedUser { user_id, username }): Extension<AuthenticatedUser>,
Extension(AuthenticatedUser {
user_id, username, ..
}): Extension<AuthenticatedUser>,
State(AppState {
connection_pool, ..
}): State<AppState>,
@@ -31,16 +33,16 @@ pub async fn change_password(
password: form.current_password,
};
if form.new_password.expose_secret() != form.new_password_check.expose_secret() {
Err(AdminError::ChangePassword(
"You entered two different passwords - the field values must match.".to_string(),
)
Err(AdminError::ChangePassword(anyhow::anyhow!(
"You entered two different passwords - the field values must match."
))
.into())
} else if let Err(e) = validate_credentials(credentials, &connection_pool).await {
match e {
AuthError::UnexpectedError(error) => Err(AdminError::UnexpectedError(error).into()),
AuthError::InvalidCredentials(_) => Err(AdminError::ChangePassword(
"The current password is incorrect.".to_string(),
)
AuthError::InvalidCredentials(_) => Err(AdminError::ChangePassword(anyhow::anyhow!(
"The current password is incorrect."
))
.into()),
}
} else if let Err(e) = verify_password(form.new_password.expose_secret()) {
@@ -48,17 +50,15 @@ pub async fn change_password(
} else {
authentication::change_password(user_id, form.new_password, &connection_pool)
.await
.map_err(|e| AdminError::ChangePassword(e.to_string()))?;
let template = MessageTemplate::Success {
message: "Your password has been changed.".to_string(),
};
.map_err(AdminError::ChangePassword)?;
let template = MessageTemplate::success("Your password has been changed.".to_string());
Ok(Html(template.render().unwrap()).into_response())
}
}
fn verify_password(password: &str) -> Result<(), String> {
pub fn verify_password(password: &str) -> Result<(), anyhow::Error> {
if password.len() < 12 || password.len() > 128 {
return Err("The password must contain between 12 and 128 characters.".into());
anyhow::bail!("The password must contain between 12 and 128 characters.");
}
Ok(())
}
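A self-contained restatement of the length rule enforced by verify_password, with a couple of illustrative assertions (not the repo's test suite); anyhow is assumed to be available, as in the crate above.
fn verify_password(password: &str) -> Result<(), anyhow::Error> {
    // Same bounds as above: reject anything shorter than 12 or longer than 128 characters.
    if password.len() < 12 || password.len() > 128 {
        anyhow::bail!("The password must contain between 12 and 128 characters.");
    }
    Ok(())
}

fn main() {
    assert!(verify_password("short").is_err());
    assert!(verify_password("a-sufficiently-long-password").is_ok());
    assert!(verify_password(&"x".repeat(129)).is_err());
}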

View File

@@ -1,20 +1,133 @@
use crate::{authentication::AuthenticatedUser, templates::DashboardTemplate};
use crate::routes::{
COMMENTS_PER_PAGE, POSTS_PER_PAGE, SUBS_PER_PAGE, get_comments_count, get_comments_page,
get_posts_count, get_posts_page, get_users,
};
use crate::{
authentication::AuthenticatedUser,
routes::{AppError, get_max_page, get_subs, get_total_subs},
startup::AppState,
templates::DashboardTemplate,
};
use anyhow::Context;
use askama::Template;
use axum::{
Extension,
extract::State,
response::{Html, IntoResponse, Response},
};
use sqlx::PgPool;
use uuid::Uuid;
pub struct DashboardStats {
pub subscribers: i64,
pub posts: i64,
pub notifications_sent: i64,
pub open_rate: f64,
}
impl DashboardStats {
pub fn formatted_rate(&self) -> String {
format!("{:.1}%", self.open_rate)
}
}
pub async fn admin_dashboard(
Extension(AuthenticatedUser { username, .. }): Extension<AuthenticatedUser>,
) -> Response {
State(AppState {
connection_pool, ..
}): State<AppState>,
Extension(user): Extension<AuthenticatedUser>,
) -> Result<Response, AppError> {
let stats = get_stats(&connection_pool).await?;
let idempotency_key_1 = Uuid::new_v4().to_string();
let idempotency_key_2 = Uuid::new_v4().to_string();
let current_page = 1;
let subscribers = get_subs(&connection_pool, current_page)
.await
.context("Could not fetch subscribers from database.")
.map_err(AppError::unexpected_message)?;
let subs_count = get_total_subs(&connection_pool)
.await
.context("Could not fetch total subscribers count from the database.")?;
let max_page = get_max_page(subs_count, SUBS_PER_PAGE);
let users = get_users(&connection_pool)
.await
.context("Could not fetch users")?;
let posts = get_posts_page(&connection_pool, 1)
.await
.context("Could not fetch posts.")?;
let posts_current_page = 1;
let posts_count = get_posts_count(&connection_pool)
.await
.context("Could not fetch posts count.")?;
let posts_max_page = get_max_page(posts_count, POSTS_PER_PAGE);
let comments_current_page = 1;
let comments = get_comments_page(&connection_pool, comments_current_page)
.await
.context("Could not fetch comments.")?;
let comments_count = get_comments_count(&connection_pool)
.await
.context("Could not fetch comments count.")?;
let comments_max_page = get_max_page(comments_count, COMMENTS_PER_PAGE);
let template = DashboardTemplate {
username,
user,
idempotency_key_1,
idempotency_key_2,
stats,
subscribers,
current_page,
max_page,
count: subs_count,
users,
posts,
posts_current_page,
posts_max_page,
posts_count,
comments,
comments_current_page,
comments_max_page,
comments_count,
};
Html(template.render().unwrap()).into_response()
Ok(Html(template.render().unwrap()).into_response())
}
#[tracing::instrument("Computing dashboard stats", skip_all)]
async fn get_stats(connection_pool: &PgPool) -> Result<DashboardStats, anyhow::Error> {
let subscribers =
sqlx::query_scalar!("SELECT count(*) FROM subscriptions WHERE status = 'confirmed'")
.fetch_one(connection_pool)
.await
.context("Failed to fetch subscribers count.")?
.unwrap_or(0);
let posts = sqlx::query_scalar!("SELECT count(*) FROM posts")
.fetch_one(connection_pool)
.await
.context("Failed to fetch posts count.")?
.unwrap_or(0);
let notifications_sent = sqlx::query_scalar!("SELECT count(*) FROM notifications_delivered")
.fetch_one(connection_pool)
.await
.context("Failed to fetch notifications sent count.")?
.unwrap_or(0);
let opened =
sqlx::query_scalar!("SELECT count(*) FROM notifications_delivered WHERE opened = TRUE")
.fetch_one(connection_pool)
.await
.context("Failed to fetch notifications sent count.")?
.unwrap_or(0);
let open_rate = if notifications_sent == 0 {
0.0
} else {
(opened as f64) / (notifications_sent as f64) * 100.0
};
Ok(DashboardStats {
subscribers,
posts,
notifications_sent,
open_rate,
})
}
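The open-rate arithmetic in get_stats is simple enough to check in isolation; a self-contained sketch follows, where the function name and values are illustrative.
fn open_rate(notifications_sent: i64, opened: i64) -> f64 {
    // Guard against division by zero when nothing has been delivered yet.
    if notifications_sent == 0 {
        0.0
    } else {
        opened as f64 / notifications_sent as f64 * 100.0
    }
}

fn main() {
    assert_eq!(open_rate(0, 0), 0.0);
    assert_eq!(open_rate(200, 50), 25.0);
    println!("{:.1}%", open_rate(3, 1)); // same "{:.1}%" format as DashboardStats::formatted_rate
}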

View File

@@ -1,5 +1,4 @@
use crate::{
authentication::AuthenticatedUser,
idempotency::{IdempotencyKey, save_response, try_processing},
routes::{AdminError, AppError},
startup::AppState,
@@ -8,11 +7,12 @@ use crate::{
use anyhow::Context;
use askama::Template;
use axum::{
Extension, Form,
Form,
extract::State,
response::{Html, IntoResponse, Response},
};
use sqlx::{Executor, Postgres, Transaction};
use std::fmt::Display;
use uuid::Uuid;
#[derive(serde::Deserialize)]
@@ -23,13 +23,14 @@ pub struct BodyData {
idempotency_key: String,
}
#[tracing::instrument(skip_all)]
#[tracing::instrument(name = "Creating newsletter isue", skip_all, fields(issue_id = tracing::field::Empty))]
pub async fn insert_newsletter_issue(
transaction: &mut Transaction<'static, Postgres>,
title: &str,
email_template: &dyn EmailTemplate,
) -> Result<Uuid, sqlx::Error> {
let newsletter_issue_id = Uuid::new_v4();
tracing::Span::current().record("issue_id", newsletter_issue_id.to_string());
let query = sqlx::query!(
r#"
INSERT INTO newsletter_issues (
@@ -46,36 +47,53 @@ pub async fn insert_newsletter_issue(
Ok(newsletter_issue_id)
}
#[tracing::instrument(skip_all)]
#[derive(Debug)]
pub enum EmailType {
NewPost,
Newsletter,
}
impl Display for EmailType {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
EmailType::NewPost => write!(f, "new_post"),
EmailType::Newsletter => write!(f, "newsletter"),
}
}
}
#[tracing::instrument(name = "Adding new task to queue", skip(transaction))]
pub async fn enqueue_delivery_tasks(
transaction: &mut Transaction<'static, Postgres>,
newsletter_issue_id: Uuid,
kind: EmailType,
) -> Result<(), sqlx::Error> {
let query = sqlx::query!(
r#"
INSERT INTO issue_delivery_queue (
newsletter_issue_id,
subscriber_email,
unsubscribe_token
unsubscribe_token,
kind
)
SELECT $1, email, unsubscribe_token
SELECT $1, email, unsubscribe_token, $2
FROM subscriptions
WHERE status = 'confirmed'
"#,
newsletter_issue_id,
kind.to_string()
);
transaction.execute(query).await?;
Ok(())
}
#[tracing::instrument(name = "Publishing a newsletter", skip(connection_pool, form))]
#[tracing::instrument(name = "Publishing a newsletter", skip_all, fields(title = %form.title))]
pub async fn publish_newsletter(
State(AppState {
connection_pool,
base_url,
..
}): State<AppState>,
Extension(AuthenticatedUser { user_id, .. }): Extension<AuthenticatedUser>,
Form(form): Form<BodyData>,
) -> Result<Response, AppError> {
validate_form(&form).map_err(|e| AdminError::Publish(anyhow::anyhow!(e)))?;
@@ -85,7 +103,7 @@ pub async fn publish_newsletter(
.try_into()
.map_err(AdminError::Idempotency)?;
let mut transaction = match try_processing(&connection_pool, &idempotency_key, user_id).await? {
let mut transaction = match try_processing(&connection_pool, &idempotency_key).await? {
crate::idempotency::NextAction::StartProcessing(t) => t,
crate::idempotency::NextAction::ReturnSavedResponse(response) => {
return Ok(response);
@@ -102,28 +120,25 @@ pub async fn publish_newsletter(
.await
.context("Failed to store newsletter issue details.")?;
enqueue_delivery_tasks(&mut transaction, issue_id)
enqueue_delivery_tasks(&mut transaction, issue_id, EmailType::Newsletter)
.await
.context("Failed to enqueue delivery tasks.")?;
let message = format!(
r#"The newsletter issue "{}" has been published!"#,
form.title
);
let template = MessageTemplate::Success { message };
let message = String::from("Your email has been queued for delivery.");
let template = MessageTemplate::success(message);
let response = Html(template.render().unwrap()).into_response();
let response = save_response(transaction, &idempotency_key, user_id, response)
let response = save_response(transaction, &idempotency_key, response)
.await
.map_err(AdminError::UnexpectedError)?;
Ok(response)
}
fn validate_form(form: &BodyData) -> Result<(), &'static str> {
fn validate_form(form: &BodyData) -> Result<(), anyhow::Error> {
if form.title.is_empty() {
return Err("The title was empty.");
anyhow::bail!("The title was empty.");
}
if form.html.is_empty() || form.text.is_empty() {
return Err("The content was empty.");
anyhow::bail!("The content was empty.");
}
Ok(())
}

View File

@@ -1,7 +1,9 @@
use crate::{
authentication::AuthenticatedUser,
idempotency::{IdempotencyKey, save_response, try_processing},
routes::{AdminError, AppError, enqueue_delivery_tasks, insert_newsletter_issue},
routes::{
AdminError, AppError, EmailType, Path, enqueue_delivery_tasks, insert_newsletter_issue,
},
startup::AppState,
templates::{MessageTemplate, NewPostEmailTemplate},
};
@@ -31,7 +33,11 @@ fn validate_form(form: &CreatePostForm) -> Result<(), anyhow::Error> {
}
}
#[tracing::instrument(name = "Creating a post", skip(connection_pool, form))]
#[tracing::instrument(
name = "Publishing new blog post",
skip(connection_pool, base_url, form),
fields(title = %form.title)
)]
pub async fn create_post(
State(AppState {
connection_pool,
@@ -48,7 +54,7 @@ pub async fn create_post(
.try_into()
.map_err(AdminError::Idempotency)?;
let mut transaction = match try_processing(&connection_pool, &idempotency_key, user_id).await? {
let mut transaction = match try_processing(&connection_pool, &idempotency_key).await? {
crate::idempotency::NextAction::StartProcessing(t) => t,
crate::idempotency::NextAction::ReturnSavedResponse(response) => {
return Ok(response);
@@ -63,24 +69,19 @@ pub async fn create_post(
.await
.context("Failed to create newsletter.")?;
enqueue_delivery_tasks(&mut transaction, newsletter_uuid)
enqueue_delivery_tasks(&mut transaction, newsletter_uuid, EmailType::NewPost)
.await
.context("Failed to enqueue delivery tasks.")?;
let template = MessageTemplate::Success {
message: "Your new post has been saved. Subscribers will be notified.".into(),
};
let template = MessageTemplate::success("Your new post has been published!".into());
let response = Html(template.render().unwrap()).into_response();
let response = save_response(transaction, &idempotency_key, user_id, response)
let response = save_response(transaction, &idempotency_key, response)
.await
.map_err(AdminError::UnexpectedError)?;
Ok(response)
}
#[tracing::instrument(
name = "Saving new post in the database",
skip(transaction, title, content, author)
)]
#[tracing::instrument(name = "Saving new blog post in the database", skip_all)]
pub async fn insert_post(
transaction: &mut Transaction<'static, Postgres>,
title: &str,
@@ -103,10 +104,7 @@ pub async fn insert_post(
Ok(post_id)
}
#[tracing::instrument(
name = "Creating newsletter for new post",
skip(transaction, post_title, post_id)
)]
#[tracing::instrument(name = "Creating newsletter for new post", skip_all)]
pub async fn create_newsletter(
transaction: &mut Transaction<'static, Postgres>,
base_url: &str,
@@ -121,3 +119,24 @@ pub async fn create_newsletter(
};
insert_newsletter_issue(transaction, post_title, &template).await
}
pub async fn delete_post(
State(AppState {
connection_pool, ..
}): State<AppState>,
Path(post_id): Path<Uuid>,
) -> Result<Response, AppError> {
let res = sqlx::query!("DELETE FROM posts WHERE post_id = $1", post_id)
.execute(&connection_pool)
.await
.context("Failed to delete post from database.")
.map_err(AppError::unexpected_message)?;
if res.rows_affected() == 0 {
Err(AppError::unexpected_message(anyhow::anyhow!(
"We could not find the post in the database."
)))
} else {
let template = MessageTemplate::success("The post has been deleted.".into());
Ok(template.render().unwrap().into_response())
}
}

View File

@@ -0,0 +1,116 @@
use crate::{
domain::SubscriberEntry,
routes::{AppError, Path, Query},
startup::AppState,
templates::{MessageTemplate, SubListTemplate},
};
use anyhow::Context;
use askama::Template;
use axum::{
extract::State,
response::{Html, IntoResponse, Response},
};
use sqlx::PgPool;
use uuid::Uuid;
pub const SUBS_PER_PAGE: i64 = 5;
#[tracing::instrument(name = "Retrieving subscribers from database", skip(connection_pool))]
pub async fn get_subscribers_page(
State(AppState {
connection_pool, ..
}): State<AppState>,
Query(SubsQueryParams { page }): Query<SubsQueryParams>,
) -> Result<Response, AppError> {
let count = get_total_subs(&connection_pool)
.await
.context("Could not fetch total subscribers count from the database.")
.map_err(AppError::unexpected_message)?;
let max_page = get_max_page(count, SUBS_PER_PAGE);
let subscribers = get_subs(&connection_pool, page)
.await
.context("Could not fetch subscribers data.")
.map_err(AppError::unexpected_message)?;
let template = SubListTemplate {
subscribers,
current_page: page,
max_page,
};
Ok(Html(template.render().unwrap()).into_response())
}
#[tracing::instrument(
name = "Deleting subscriber from database",
skip(connection_pool),
fields(email=tracing::field::Empty)
)]
pub async fn delete_subscriber(
State(AppState {
connection_pool, ..
}): State<AppState>,
Path(subscriber_id): Path<Uuid>,
) -> Result<Response, AppError> {
let res = sqlx::query!(
"DELETE FROM subscriptions WHERE id = $1 RETURNING email",
subscriber_id
)
.fetch_optional(&connection_pool)
.await
.context("Failed to delete subscriber from database.")
.map_err(AppError::unexpected_message)?;
if let Some(record) = res {
tracing::Span::current().record("email", tracing::field::display(&record.email));
let template = MessageTemplate::success(format!(
"The subscriber with email '{}' has been deleted.",
record.email
));
Ok(template.render().unwrap().into_response())
} else {
Err(AppError::unexpected_message(anyhow::anyhow!(
"We could not find the subscriber in the database."
)))
}
}
#[tracing::instrument(
name = "Retrieving next subscribers in database",
skip(connection_pool),
fields(offset = tracing::field::Empty)
)]
pub async fn get_subs(
connection_pool: &PgPool,
page: i64,
) -> Result<Vec<SubscriberEntry>, sqlx::Error> {
let offset = (page - 1) * SUBS_PER_PAGE;
tracing::Span::current().record("offset", tracing::field::display(&offset));
let subscribers = sqlx::query_as!(
SubscriberEntry,
"SELECT * FROM subscriptions ORDER BY subscribed_at DESC LIMIT $1 OFFSET $2",
SUBS_PER_PAGE,
offset
)
.fetch_all(connection_pool)
.await?;
Ok(subscribers)
}
pub async fn get_total_subs(connection_pool: &PgPool) -> Result<i64, sqlx::Error> {
let count = sqlx::query_scalar!("SELECT count(*) FROM subscriptions")
.fetch_one(connection_pool)
.await?
.unwrap_or(0);
Ok(count)
}
pub fn get_max_page(count: i64, num_per_page: i64) -> i64 {
let mut max_page = count.div_euclid(num_per_page);
if count % num_per_page > 0 {
max_page += 1;
}
max_page
}
#[derive(serde::Deserialize)]
pub struct SubsQueryParams {
page: i64,
}
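get_max_page above is ceiling division in disguise; a small self-contained check of its edge cases (values illustrative).
fn get_max_page(count: i64, num_per_page: i64) -> i64 {
    // Copied from the handler module above: full pages plus one for any remainder.
    let mut max_page = count.div_euclid(num_per_page);
    if count % num_per_page > 0 {
        max_page += 1;
    }
    max_page
}

fn main() {
    assert_eq!(get_max_page(0, 5), 0); // empty table: no pages
    assert_eq!(get_max_page(5, 5), 1); // exact multiple
    assert_eq!(get_max_page(6, 5), 2); // one extra row starts a new page
}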

264
src/routes/comments.rs Normal file
View File

@@ -0,0 +1,264 @@
use crate::idempotency::{IdempotencyKey, save_response, try_processing};
use crate::routes::{AdminError, get_max_page};
use crate::templates::CommentsPageDashboardTemplate;
use crate::{
domain::CommentEntry,
routes::AppError,
startup::AppState,
templates::{CommentsList, HtmlTemplate, MessageTemplate},
};
use anyhow::Context;
use askama::Template;
use axum::{
Form,
extract::{Path, Query, State},
response::{IntoResponse, Response},
};
use sqlx::{Executor, PgPool, Postgres, Transaction};
use uuid::Uuid;
#[derive(serde::Deserialize)]
pub struct CommentPathParam {
post_id: Uuid,
}
#[derive(serde::Deserialize)]
pub struct CommentForm {
pub author: Option<String>,
pub content: String,
pub idempotency_key: String,
pub user_id: Option<Uuid>,
}
#[tracing::instrument(name = "Posting new comment", skip_all, fields(post_id = %post_id))]
pub async fn post_comment(
Path(CommentPathParam { post_id }): Path<CommentPathParam>,
State(AppState {
connection_pool, ..
}): State<AppState>,
Form(form): Form<CommentForm>,
) -> Result<Response, AppError> {
validate_form(&form)?;
let idempotency_key: IdempotencyKey = form
.idempotency_key
.try_into()
.map_err(AdminError::Idempotency)?;
let mut transaction = match try_processing(&connection_pool, &idempotency_key).await? {
crate::idempotency::NextAction::StartProcessing(t) => t,
crate::idempotency::NextAction::ReturnSavedResponse(response) => {
return Ok(response);
}
};
insert_comment(
&mut transaction,
post_id,
form.author,
form.user_id,
form.content,
)
.await
.context("Could not insert comment into database.")?;
let template = HtmlTemplate(MessageTemplate::success(
"Your comment has been posted.".into(),
));
let response = template.into_response();
let response = save_response(transaction, &idempotency_key, response).await?;
Ok(response)
}
fn validate_form(form: &CommentForm) -> Result<(), anyhow::Error> {
if form.content.is_empty() {
anyhow::bail!("Comment content cannot be empty.");
}
Ok(())
}
#[tracing::instrument(name = "Inserting new comment in database", skip_all, fields(comment_id = tracing::field::Empty))]
async fn insert_comment(
transaction: &mut Transaction<'static, Postgres>,
post_id: Uuid,
author: Option<String>,
user_id: Option<Uuid>,
content: String,
) -> Result<Uuid, sqlx::Error> {
let author = if user_id.is_some() {
None
} else {
author
.filter(|s| !s.trim().is_empty())
.map(|s| s.trim().to_string())
};
let content = content.trim();
let comment_id = Uuid::new_v4();
tracing::Span::current().record("comment_id", comment_id.to_string());
let query = sqlx::query!(
"
INSERT INTO comments (user_id, comment_id, post_id, author, content)
VALUES ($1, $2, $3, $4, $5)
",
user_id,
comment_id,
post_id,
author,
content,
);
transaction.execute(query).await?;
Ok(comment_id)
}
pub const COMMENTS_PER_PAGE: i64 = 5;
#[derive(serde::Deserialize)]
pub struct GetCommentsQueryParams {
page: i64,
}
#[tracing::instrument(name = "Fetching comments", skip(connection_pool))]
pub async fn get_comments(
Path(CommentPathParam { post_id }): Path<CommentPathParam>,
Query(GetCommentsQueryParams { page }): Query<GetCommentsQueryParams>,
State(AppState {
connection_pool, ..
}): State<AppState>,
) -> Result<Response, AppError> {
let comments = get_comments_page_for_post(&connection_pool, post_id, page)
.await
.context("Could not fetch comments.")?;
let count = get_comments_count_for_post(&connection_pool, post_id)
.await
.context("Could not fetch comments count")?;
let max_page = get_max_page(count, COMMENTS_PER_PAGE);
let template = HtmlTemplate(CommentsList {
comments,
current_page: page,
max_page,
});
Ok(template.into_response())
}
#[tracing::instrument(name = "Fetching all comments", skip(connection_pool))]
pub async fn get_all_comments(
Query(GetCommentsQueryParams { page }): Query<GetCommentsQueryParams>,
State(AppState {
connection_pool, ..
}): State<AppState>,
) -> Result<Response, AppError> {
let comments = get_comments_page(&connection_pool, page)
.await
.context("Could not fetch comments.")?;
let count = get_comments_count(&connection_pool)
.await
.context("Could not fetch comments count")?;
let comments_max_page = get_max_page(count, COMMENTS_PER_PAGE);
let template = HtmlTemplate(CommentsPageDashboardTemplate {
comments,
comments_current_page: page,
comments_max_page,
});
Ok(template.into_response())
}
pub async fn delete_comment(
State(AppState {
connection_pool, ..
}): State<AppState>,
crate::routes::Path(comment_id): crate::routes::Path<Uuid>,
) -> Result<Response, AppError> {
let res = sqlx::query!("DELETE FROM comments WHERE comment_id = $1", comment_id)
.execute(&connection_pool)
.await
.context("Failed to delete comment from database.")
.map_err(AppError::unexpected_message)?;
if res.rows_affected() == 0 {
Err(AppError::unexpected_message(anyhow::anyhow!(
"We could not find the comment in the database."
)))
} else {
let template = MessageTemplate::success("The comment has been deleted.".into());
Ok(template.render().unwrap().into_response())
}
}
pub async fn get_comments_page_for_post(
connection_pool: &PgPool,
post_id: Uuid,
page: i64,
) -> Result<Vec<CommentEntry>, sqlx::Error> {
let offset = (page - 1) * COMMENTS_PER_PAGE;
let mut comments = sqlx::query_as!(
CommentEntry,
r#"
SELECT c.user_id as "user_id?", u.username as "username?", c.comment_id, c.post_id, c.author, c.content, c.published_at
FROM comments c
LEFT JOIN users u ON c.user_id = u.user_id AND c.user_id IS NOT NULL
WHERE c.post_id = $1
ORDER BY c.published_at DESC
LIMIT $2
OFFSET $3
"#,
post_id,
COMMENTS_PER_PAGE,
offset
)
.fetch_all(connection_pool)
.await?;
for comment in comments.iter_mut() {
if let Some(user_id) = comment.user_id {
let record = sqlx::query!(
"SELECT username, full_name FROM users WHERE user_id = $1",
user_id
)
.fetch_one(connection_pool)
.await?;
let author = record.full_name.unwrap_or(record.username);
comment.author = Some(author);
}
}
Ok(comments)
}
pub async fn get_comments_count_for_post(
connection_pool: &PgPool,
post_id: Uuid,
) -> Result<i64, sqlx::Error> {
let count = sqlx::query_scalar!("SELECT count(*) FROM comments WHERE post_id = $1", post_id)
.fetch_one(connection_pool)
.await?
.unwrap_or(0);
Ok(count)
}
pub async fn get_comments_page(
connection_pool: &PgPool,
page: i64,
) -> Result<Vec<CommentEntry>, sqlx::Error> {
let offset = (page - 1) * COMMENTS_PER_PAGE;
let comments = sqlx::query_as!(
CommentEntry,
r#"
SELECT c.user_id as "user_id?", u.username as "username?", c.comment_id, c.post_id, c.author, c.content, c.published_at
FROM comments c
LEFT JOIN users u ON c.user_id = u.user_id AND c.user_id IS NOT NULL
ORDER BY published_at DESC
LIMIT $1
OFFSET $2
"#,
COMMENTS_PER_PAGE,
offset
)
.fetch_all(connection_pool)
.await?;
Ok(comments)
}
pub async fn get_comments_count(connection_pool: &PgPool) -> Result<i64, sqlx::Error> {
let count = sqlx::query_scalar!("SELECT count(*) FROM comments")
.fetch_one(connection_pool)
.await?
.unwrap_or(0);
Ok(count)
}

View File

@@ -1,5 +1,5 @@
use axum::{http::StatusCode, response::IntoResponse};
use axum::http::StatusCode;
pub async fn health_check() -> impl IntoResponse {
pub async fn health_check() -> StatusCode {
StatusCode::OK
}

View File

@@ -1,8 +1,7 @@
use askama::Template;
use axum::response::Html;
use crate::templates::{HomeTemplate, HtmlTemplate};
use axum::response::{IntoResponse, Response};
use crate::templates::HomeTemplate;
pub async fn home() -> Html<String> {
Html(HomeTemplate.render().unwrap())
pub async fn home() -> Response {
let template = HtmlTemplate(HomeTemplate);
template.into_response()
}

View File

@@ -15,6 +15,8 @@ use axum::{
};
use axum::{http::StatusCode, response::Redirect};
use secrecy::SecretString;
use sqlx::PgPool;
use uuid::Uuid;
#[derive(serde::Deserialize)]
pub struct LoginFormData {
@@ -29,12 +31,13 @@ pub async fn get_login(session: TypedSession) -> Result<Response, AppError> {
.context("Failed to retrieve user id from data store.")?
.is_some()
{
Ok(Redirect::to("/admin/dashboard").into_response())
Ok(Redirect::to("dashboard").into_response())
} else {
Ok(Html(LoginTemplate.render().unwrap()).into_response())
}
}
#[tracing::instrument(name = "Authenticating user", skip_all, fields(name = %form.username))]
pub async fn post_login(
session: TypedSession,
State(AppState {
@@ -47,8 +50,11 @@ pub async fn post_login(
password: form.password,
};
tracing::Span::current().record("username", tracing::field::display(&credentials.username));
let user_id = validate_credentials(credentials, &connection_pool).await?;
let (user_id, role) = validate_credentials(credentials, &connection_pool).await?;
tracing::Span::current().record("user_id", tracing::field::display(&user_id));
record_login(&connection_pool, &user_id)
.await
.context("Failed to register new login event.")?;
session.renew().await.context("Failed to renew session.")?;
session
@@ -59,8 +65,20 @@ pub async fn post_login(
.insert_username(form.username)
.await
.context("Failed to insert username in session data store.")?;
session
.insert_role(role)
.await
.context("Failed to insert role in session data store.")?;
let mut headers = HeaderMap::new();
headers.insert("HX-Redirect", "/admin/dashboard".parse().unwrap());
headers.insert("HX-Redirect", "/dashboard".parse().unwrap());
Ok((StatusCode::OK, headers).into_response())
}
#[tracing::instrument(name = "Recording new login event", skip_all, fields(user_id = %user_id))]
async fn record_login(connection_pool: &PgPool, user_id: &Uuid) -> Result<(), sqlx::Error> {
sqlx::query!("INSERT INTO user_logins (user_id) VALUES ($1)", user_id)
.execute(connection_pool)
.await?;
Ok(())
}
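
post_login now stores a Role in the session alongside the user id; the enum lives in the authentication module and is not part of this diff. A hypothetical sketch of its shape, assuming it maps onto a Postgres enum and is (de)serializable for session storage (the sqlx type_name and rename_all values here are assumptions, not the project's actual schema):

#[derive(Clone, Copy, Debug, PartialEq, serde::Serialize, serde::Deserialize, sqlx::Type)]
#[sqlx(type_name = "user_role", rename_all = "lowercase")]
pub enum Role {
    // Only these two variants appear in this diff; others may exist.
    Admin,
    Writer,
}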

View File

@@ -1,74 +1,308 @@
use crate::authentication::AuthenticatedUser;
use crate::routes::{COMMENTS_PER_PAGE, Query, get_max_page, join_error_messages};
use crate::session_state::TypedSession;
use crate::templates::{ErrorTemplate, MessageTemplate, PostsPageDashboardTemplate};
use crate::{
domain::PostEntry,
routes::AppError,
routes::{
AppError, Path, get_comments_count_for_post, get_comments_page_for_post, not_found_html,
},
startup::AppState,
templates::{PostTemplate, PostsTemplate},
templates::{HtmlTemplate, PostListTemplate, PostTemplate, PostsTemplate},
};
use anyhow::Context;
use askama::Template;
use axum::{
extract::{Path, State},
response::{Html, IntoResponse, Response},
Extension, Form,
extract::State,
response::{Html, IntoResponse, Redirect, Response},
};
use chrono::Utc;
use sqlx::PgPool;
use uuid::Uuid;
use validator::Validate;
pub const POSTS_PER_PAGE: i64 = 3;
#[tracing::instrument(name = "Fetching most recent posts from database", skip_all)]
pub async fn list_posts(
State(AppState {
connection_pool, ..
}): State<AppState>,
) -> Result<Response, AppError> {
let posts = get_latest_posts(&connection_pool, 5)
let count = get_posts_count(&connection_pool)
.await
.context("Could not fetch posts table size.")
.map_err(AppError::unexpected_page)?;
let next_page = if count > POSTS_PER_PAGE {
Some(2)
} else {
None
};
let posts = get_posts(&connection_pool, POSTS_PER_PAGE, None)
.await
.context("Could not fetch latest posts")
.map_err(AppError::unexpected_page)?;
let template = PostsTemplate { posts };
let template = PostsTemplate { posts, next_page };
Ok(Html(template.render().unwrap()).into_response())
}
async fn get_latest_posts(connection_pool: &PgPool, n: i64) -> Result<Vec<PostEntry>, sqlx::Error> {
#[tracing::instrument(name = "Fetching next posts from database", skip_all)]
pub async fn get_posts_page_dashboard(
State(AppState {
connection_pool, ..
}): State<AppState>,
Query(LoadMoreParams { page }): Query<LoadMoreParams>,
) -> Result<Response, AppError> {
let posts = get_posts_page(&connection_pool, page)
.await
.context("Could not fetch next posts page.")?;
let posts_current_page = page;
let count = get_posts_count(&connection_pool)
.await
.context("Could not fetch number of posts.")?;
let posts_max_page = get_max_page(count, POSTS_PER_PAGE);
let template = HtmlTemplate(PostsPageDashboardTemplate {
posts,
posts_current_page,
posts_max_page,
});
Ok(template.into_response())
}
async fn get_posts(
connection_pool: &PgPool,
n: i64,
offset: Option<i64>,
) -> Result<Vec<PostEntry>, sqlx::Error> {
sqlx::query_as!(
PostEntry,
r#"
SELECT p.post_id, u.username AS author, p.title, p.content, p.published_at
SELECT p.post_id, p.author_id, u.username AS author, u.full_name,
p.title, p.content, p.published_at, p.last_modified
FROM posts p
LEFT JOIN users u ON p.author_id = u.user_id
ORDER BY p.published_at DESC
LIMIT $1
OFFSET $2
"#,
n
n,
offset
)
.fetch_all(connection_pool)
.await
}
pub async fn get_posts_page(
connection_pool: &PgPool,
page: i64,
) -> Result<Vec<PostEntry>, sqlx::Error> {
let offset = (page - 1) * POSTS_PER_PAGE;
sqlx::query_as!(
PostEntry,
r#"
SELECT p.post_id, p.author_id, u.username AS author, u.full_name,
p.title, p.content, p.published_at, p.last_modified
FROM posts p
LEFT JOIN users u ON p.author_id = u.user_id
ORDER BY p.published_at DESC
LIMIT $1
OFFSET $2
"#,
POSTS_PER_PAGE,
offset
)
.fetch_all(connection_pool)
.await
}
pub async fn get_posts_count(connection_pool: &PgPool) -> Result<i64, sqlx::Error> {
sqlx::query!("SELECT count(*) FROM posts")
.fetch_one(connection_pool)
.await
.map(|r| r.count.unwrap())
}
#[derive(Validate, serde::Deserialize)]
pub struct EditPostForm {
#[validate(length(min = 1, message = "Title must be at least one character."))]
pub title: String,
#[validate(length(min = 1, message = "Content must be at least one character."))]
pub content: String,
}
#[tracing::instrument(name = "Editing post", skip_all, fields(post_id = %post_id))]
pub async fn update_post(
State(AppState {
connection_pool, ..
}): State<AppState>,
Extension(AuthenticatedUser { user_id, .. }): Extension<AuthenticatedUser>,
Path(post_id): Path<Uuid>,
Form(form): Form<EditPostForm>,
) -> Result<Response, AppError> {
let record = sqlx::query!("SELECT author_id FROM posts WHERE post_id = $1", post_id)
.fetch_optional(&connection_pool)
.await
.context("Could not fetch post author.")?;
match record {
None => Ok(HtmlTemplate(ErrorTemplate::NotFound).into_response()),
Some(record) if record.author_id == user_id => {
if let Err(e) = form.validate().map_err(join_error_messages) {
let template = HtmlTemplate(MessageTemplate::error(e));
return Ok(template.into_response());
}
sqlx::query!(
"
UPDATE posts
SET title = $1, content = $2, last_modified = $3 WHERE post_id = $4
",
form.title,
form.content,
Utc::now(),
post_id
)
.execute(&connection_pool)
.await
.context("Could not update post")?;
Ok(HtmlTemplate(MessageTemplate::success(
"Your changes have been saved.".into(),
))
.into_response())
}
_ => Ok(HtmlTemplate(MessageTemplate::error(
"You are not authorized. Only the author can edit his post.".into(),
))
.into_response()),
}
}
#[derive(serde::Deserialize)]
pub struct OriginQueryParam {
origin: Option<Uuid>,
}
#[tracing::instrument(
name = "Fetching post from database",
skip(connection_pool, origin, session)
)]
pub async fn see_post(
session: TypedSession,
State(AppState {
connection_pool, ..
}): State<AppState>,
Path(post_id): Path<Uuid>,
Query(OriginQueryParam { origin }): Query<OriginQueryParam>,
) -> Result<Response, AppError> {
let post = get_post(&connection_pool, post_id)
if let Some(origin) = origin {
mark_email_as_opened(&connection_pool, origin).await?;
return Ok(Redirect::to(&format!("/posts/{}", post_id)).into_response());
}
if let Some(post) = get_post_data(&connection_pool, post_id)
.await
.context(format!("Failed to fetch post #{}", post_id))
.context(format!("Failed to fetch post #{}.", post_id))
.map_err(AppError::unexpected_page)?
.to_html()
.context("Could not render markdown with extension.")?;
let template = PostTemplate { post };
Ok(Html(template.render().unwrap()).into_response())
{
let post_html = post
.to_html()
.context("Could not render markdown with extension.")?;
let current_page = 1;
let comments_count = get_comments_count_for_post(&connection_pool, post_id)
.await
.context("Could not fetch comment count.")?;
let max_page = get_max_page(comments_count, COMMENTS_PER_PAGE);
let comments = get_comments_page_for_post(&connection_pool, post_id, 1)
.await
.context("Failed to fetch latest comments.")?;
let idempotency_key = Uuid::new_v4().to_string();
let session_user_id = session
.get_user_id()
.await
.context("Could not check for session user id.")?;
let session_username = session
.get_username()
.await
.context("Could not check for session username.")?;
let template = HtmlTemplate(PostTemplate {
post,
post_html,
comments,
idempotency_key,
current_page,
max_page,
comments_count,
session_user_id,
session_username,
});
Ok(template.into_response())
} else {
Ok(not_found_html())
}
}
async fn get_post(connection_pool: &PgPool, post_id: Uuid) -> Result<PostEntry, sqlx::Error> {
#[tracing::instrument(name = "Mark email notification as opened", skip(connection_pool))]
async fn mark_email_as_opened(connection_pool: &PgPool, email_id: Uuid) -> Result<(), AppError> {
sqlx::query!(
"UPDATE notifications_delivered SET opened = TRUE WHERE email_id = $1",
email_id,
)
.execute(connection_pool)
.await
.context("Failed to mark email as opened.")
.map_err(AppError::unexpected_page)?;
Ok(())
}
async fn get_post_data(
connection_pool: &PgPool,
post_id: Uuid,
) -> Result<Option<PostEntry>, sqlx::Error> {
sqlx::query_as!(
PostEntry,
r#"
SELECT p.post_id, u.username AS author, p.title, p.content, p.published_at
SELECT p.post_id, p.author_id, u.username AS author, u.full_name,
p.title, p.content, p.published_at, last_modified
FROM posts p
LEFT JOIN users u ON p.author_id = u.user_id
WHERE p.post_id = $1
"#,
post_id
)
.fetch_one(connection_pool)
.fetch_optional(connection_pool)
.await
}
#[derive(serde::Deserialize)]
pub struct LoadMoreParams {
page: i64,
}
#[tracing::instrument(name = "Fetching next posts in the database", skip(connection_pool))]
pub async fn load_more(
State(AppState {
connection_pool, ..
}): State<AppState>,
Query(LoadMoreParams { page }): Query<LoadMoreParams>,
) -> Result<Response, AppError> {
let posts = get_posts_page(&connection_pool, page)
.await
.context("Could not fetch posts from database.")?;
let count = get_posts_count(&connection_pool)
.await
.context("Could not fetch posts count.")?;
let max_page = get_max_page(count, POSTS_PER_PAGE);
Ok(Html(
PostListTemplate {
posts,
next_page: if page < max_page {
Some(page + 1)
} else {
None
},
}
.render()
.unwrap(),
)
.into_response())
}
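
update_post above validates the form with form.validate().map_err(join_error_messages); the helper is imported from routes and not shown in this diff. One plausible shape, assuming it flattens validator::ValidationErrors into a single display string for the message banner:

use validator::ValidationErrors;

pub fn join_error_messages(errors: ValidationErrors) -> String {
    // Collect every field-level message and join them into one line,
    // suitable for MessageTemplate::error.
    errors
        .field_errors()
        .values()
        .flat_map(|field_errors| field_errors.iter())
        .filter_map(|e| e.message.as_ref().map(|m| m.to_string()))
        .collect::<Vec<_>>()
        .join(" ")
}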

View File

@@ -19,9 +19,9 @@ use uuid::Uuid;
#[tracing::instrument(
name = "Adding a new subscriber",
skip(connection_pool, email_client, base_url, form),
skip_all,
fields(
subscriber_email = %form.email,
email = %form.email,
)
)]
pub async fn subscribe(
@@ -66,16 +66,12 @@ pub async fn subscribe(
.context("Failed to commit the database transaction to store a new subscriber.")?;
}
let template = MessageTemplate::Success {
message: "You'll receive a confirmation email shortly.".to_string(),
};
let template =
MessageTemplate::success("You'll receive a confirmation email shortly.".to_string());
Ok(Html(template.render().unwrap()).into_response())
}
#[tracing::instrument(
name = "Saving new subscriber details in the database",
skip(transaction, new_subscriber)
)]
#[tracing::instrument(name = "Saving new subscriber details in the database", skip_all)]
pub async fn insert_subscriber(
transaction: &mut Transaction<'_, Postgres>,
new_subscriber: &NewSubscriber,
@@ -123,10 +119,7 @@ async fn store_token(
Ok(())
}
#[tracing::instrument(
name = "Send a confirmation email to a new subscriber",
skip(email_client, new_subscriber, base_url, subscription_token)
)]
#[tracing::instrument(name = "Send confirmation email to the new subscriber", skip_all)]
pub async fn send_confirmation_email(
email_client: &EmailClient,
new_subscriber: &NewSubscriber,

View File

@@ -1,44 +1,39 @@
use crate::{routes::generate_token, startup::AppState, templates::ConfirmTemplate};
use askama::Template;
use crate::{
routes::{AppError, Query, generate_token, not_found_html},
startup::AppState,
templates::{ConfirmTemplate, HtmlTemplate},
};
use anyhow::Context;
use axum::{
extract::{Query, State},
http::StatusCode,
response::{Html, IntoResponse, Response},
extract::State,
response::{IntoResponse, Response},
};
use serde::Deserialize;
use sqlx::PgPool;
use uuid::Uuid;
#[tracing::instrument(name = "Confirming new subscriber", skip(params))]
#[tracing::instrument(name = "Confirming new subscriber", skip_all)]
pub async fn confirm(
State(AppState {
connection_pool, ..
}): State<AppState>,
Query(params): Query<Params>,
) -> Response {
let Ok(subscriber_id) =
get_subscriber_id_from_token(&connection_pool, &params.subscription_token).await
else {
return StatusCode::INTERNAL_SERVER_ERROR.into_response();
};
if let Some(subscriber_id) = subscriber_id {
if confirm_subscriber(&connection_pool, &subscriber_id)
) -> Result<Response, AppError> {
let subscriber_id = get_subscriber_id_from_token(&connection_pool, &params.subscription_token)
.await
.context("Could not fetch subscriber id given subscription token.")?;
if let Some(id) = subscriber_id {
confirm_subscriber(&connection_pool, &id)
.await
.is_err()
{
StatusCode::INTERNAL_SERVER_ERROR.into_response()
} else {
Html(ConfirmTemplate.render().unwrap()).into_response()
}
.context("Failed to update subscriber status.")?;
let template = HtmlTemplate(ConfirmTemplate);
Ok(template.into_response())
} else {
StatusCode::UNAUTHORIZED.into_response()
Ok(not_found_html())
}
}
#[tracing::instrument(
name = "Mark subscriber as confirmed",
skip(connection_pool, subscriber_id)
)]
#[tracing::instrument(name = "Mark subscriber as confirmed", skip(connection_pool))]
async fn confirm_subscriber(
connection_pool: &PgPool,
subscriber_id: &Uuid,
@@ -49,18 +44,11 @@ async fn confirm_subscriber(
subscriber_id
)
.execute(connection_pool)
.await
.map_err(|e| {
tracing::error!("Failed to execute query: {:?}", e);
e
})?;
.await?;
Ok(())
}
#[tracing::instrument(
name = "Get subscriber_id from token",
skip(connection, subscription_token)
)]
#[tracing::instrument(name = "Get subscriber id from token", skip(connection))]
async fn get_subscriber_id_from_token(
connection: &PgPool,
subscription_token: &str,
@@ -70,11 +58,7 @@ async fn get_subscriber_id_from_token(
subscription_token
)
.fetch_optional(connection)
.await
.map_err(|e| {
tracing::error!("Failed to execute query: {:?}", e);
e
})?;
.await?;
Ok(saved.map(|r| r.subscriber_id))
}

View File

@@ -1,11 +1,9 @@
use crate::{
domain::SubscriberEmail,
email_client::EmailClient,
routes::AppError,
routes::{AppError, not_found_html},
startup::AppState,
templates::{
MessageTemplate, NotFoundTemplate, UnsubscribeConfirmTemplate, UnsubscribeTemplate,
},
templates::{MessageTemplate, UnsubscribeConfirmTemplate, UnsubscribeTemplate},
};
use anyhow::Context;
use askama::Template;
@@ -14,7 +12,6 @@ use axum::{
extract::{Query, State},
response::{Html, IntoResponse, Response},
};
use reqwest::StatusCode;
use sqlx::{Executor, PgPool};
#[derive(serde::Deserialize)]
@@ -31,6 +28,10 @@ pub struct UnsubFormData {
email: String,
}
#[tracing::instrument(
name = "Removing subscriber from database",
skip(connection_pool, email_client, base_url)
)]
pub async fn post_unsubscribe(
State(AppState {
connection_pool,
@@ -48,13 +49,13 @@ pub async fn post_unsubscribe(
.await
.context("Failed to send a confirmation email.")?;
}
let template = MessageTemplate::Success {
message: "If you are a subscriber, you'll receive a confirmation link shortly.".into(),
};
let template = MessageTemplate::success(
"If you are a subscriber, you'll receive a confirmation link shortly.".into(),
);
Ok(Html(template.render().unwrap()).into_response())
}
#[tracing::instrument(name = "Fetching unsubscribe token from the database", skip_all)]
#[tracing::instrument(name = "Fetching unsubscribe token from database", skip_all)]
async fn fetch_unsubscribe_token(
connection_pool: &PgPool,
subscriber_email: &SubscriberEmail,
@@ -69,7 +70,7 @@ async fn fetch_unsubscribe_token(
Ok(r.and_then(|r| r.unsubscribe_token))
}
#[tracing::instrument(name = "Send an unsubscribe confirmation email", skip_all)]
#[tracing::instrument(name = "Send an confirmation email", skip_all)]
pub async fn send_unsubscribe_email(
email_client: &EmailClient,
subscriber_email: &SubscriberEmail,
@@ -81,15 +82,15 @@ pub async fn send_unsubscribe_email(
base_url, unsubscribe_token
);
let html_content = format!(
"You've requested to unsubscribe from my emails. To confirm, please click the link below:<br />\
<a href=\"{}\">Confirm unsubscribe</a><br />\
If you did not request this, you can safely ignore this email.",
r#"You've requested to unsubscribe from the newsletter. To confirm, please click the link below:<br />
<a href="{}">Confirm unsubscribe</a><br />
If you did not request this, you can safely ignore this email."#,
confirmation_link
);
let text_content = format!(
"You've requested to unsubscribe from my emails. To confirm, please follow the link below:\
{}\
If you did not request this, you can safely ignore this email.",
r#"You've requested to unsubscribe from the newsletter. To confirm, please follow the link below:
{}
If you did not request this, you can safely ignore this email."#,
confirmation_link
);
email_client
@@ -102,7 +103,7 @@ If you did not request this, you can safely ignore this email.",
.await
}
#[tracing::instrument(name = "Removing user from database if he exists", skip_all)]
#[tracing::instrument(name = "Removing user from database", skip(connection_pool))]
pub async fn unsubscribe_confirm(
Query(UnsubQueryParams { token }): Query<UnsubQueryParams>,
State(AppState {
@@ -120,11 +121,7 @@ pub async fn unsubscribe_confirm(
if result.rows_affected() == 0 {
tracing::info!("Unsubscribe token is not tied to any confirmed user");
Ok((
StatusCode::NOT_FOUND,
Html(NotFoundTemplate.render().unwrap()),
)
.into_response())
Ok(not_found_html())
} else {
tracing::info!("User successfully removed");
Ok(Html(UnsubscribeConfirmTemplate.render().unwrap()).into_response())

src/routes/users.rs (new file, 336 lines)
View File

@@ -0,0 +1,336 @@
use crate::authentication::AuthenticatedUser;
use crate::routes::{join_error_messages, verify_password};
use crate::session_state::TypedSession;
use crate::templates::{MessageTemplate, UserEditTemplate};
use crate::{
authentication::Role,
domain::{PostEntry, UserEntry},
routes::{AppError, not_found_html},
startup::AppState,
templates::{HtmlTemplate, UserTemplate},
};
use anyhow::Context;
use axum::{
Extension, Form,
extract::{Path, State},
response::{IntoResponse, Response},
};
use secrecy::{ExposeSecret, SecretString};
use sqlx::PgPool;
use uuid::Uuid;
use validator::Validate;
pub async fn user_edit_form(
Extension(AuthenticatedUser { user_id, .. }): Extension<AuthenticatedUser>,
State(AppState {
connection_pool, ..
}): State<AppState>,
) -> Result<Response, AppError> {
let user = sqlx::query_as!(
UserEntry,
r#"
SELECT user_id, username, role as "role: Role", full_name, bio, member_since
FROM users
WHERE user_id = $1
"#,
user_id
)
.fetch_one(&connection_pool)
.await
.context("Could not fetch user in database.")?;
let template = HtmlTemplate(UserEditTemplate { user });
Ok(template.into_response())
}
#[derive(Debug, Validate, serde::Deserialize)]
pub struct EditProfileForm {
user_id: Uuid,
#[validate(length(min = 3, message = "Username must be at least 3 characters."))]
username: String,
full_name: String,
bio: String,
}
#[tracing::instrument(name = "Updating user profile", skip_all, fields(user_id = %form.user_id))]
pub async fn update_user(
State(AppState {
connection_pool, ..
}): State<AppState>,
session: TypedSession,
Extension(AuthenticatedUser {
user_id: session_user_id,
username: session_username,
..
}): Extension<AuthenticatedUser>,
Form(form): Form<EditProfileForm>,
) -> Result<Response, AppError> {
if let Err(e) = form.validate().map_err(join_error_messages) {
let template = HtmlTemplate(MessageTemplate::error(e));
return Ok(template.into_response());
}
if form.user_id != session_user_id {
let template = HtmlTemplate(MessageTemplate::error(
"You are not authorized. Refresh the page and try again.".into(),
));
return Ok(template.into_response());
}
let updated_username = form.username.trim();
if updated_username != session_username
&& sqlx::query!(
"SELECT user_id FROM users WHERE username = $1",
updated_username
)
.fetch_optional(&connection_pool)
.await
.context("Could not fetch users table.")?
.is_some()
{
let template = HtmlTemplate(MessageTemplate::error(
"This username is already taken.".into(),
));
return Ok(template.into_response());
}
let updated_full_name = form.full_name.trim();
let bio = {
let bio = form.bio.trim();
if bio.is_empty() { None } else { Some(bio) }
};
sqlx::query!(
"
UPDATE users
SET username = $1, full_name = $2, bio = $3
WHERE user_id = $4
",
updated_username,
updated_full_name,
bio,
form.user_id
)
.execute(&connection_pool)
.await
.context("Failed to apply changes.")
.map_err(AppError::FormError)?;
session
.insert_username(updated_username.to_owned())
.await
.context("Could not update session username.")?;
let template = HtmlTemplate(MessageTemplate::success(
"Your profile has been updated.".into(),
));
Ok(template.into_response())
}
#[tracing::instrument(name = "Get users from database", skip(connection_pool))]
pub async fn get_users(connection_pool: &PgPool) -> Result<Vec<UserEntry>, sqlx::Error> {
sqlx::query_as!(
UserEntry,
r#"
SELECT user_id, username, role as "role: Role", full_name, bio, member_since
FROM users
ORDER BY member_since DESC
"#
)
.fetch_all(connection_pool)
.await
}
#[derive(Debug, serde::Deserialize)]
pub struct CreateUserForm {
username: String,
password: SecretString,
password_check: SecretString,
admin: Option<bool>,
}
struct NewUser {
username: String,
password_hash: SecretString,
role: Role,
}
impl TryFrom<CreateUserForm> for NewUser {
type Error = anyhow::Error;
fn try_from(value: CreateUserForm) -> Result<Self, Self::Error> {
if value.username.trim().is_empty() {
anyhow::bail!("Username cannot be empty.");
}
verify_password(value.password.expose_secret())?;
if value.password.expose_secret() != value.password_check.expose_secret() {
anyhow::bail!("Password mismatch.");
}
let role = match value.admin {
Some(true) => Role::Admin,
_ => Role::Writer,
};
let password_hash = crate::authentication::compute_pasword_hash(value.password)
.context("Failed to hash password.")?;
Ok(Self {
username: value.username,
password_hash,
role,
})
}
}
#[tracing::instrument(name = "Creating new user", skip_all, fields(username = %form.username))]
pub async fn create_user(
State(AppState {
connection_pool, ..
}): State<AppState>,
Form(form): Form<CreateUserForm>,
) -> Result<Response, AppError> {
let new_user: NewUser = match form.try_into().map_err(|e: anyhow::Error| e.to_string()) {
Err(e) => {
let template = HtmlTemplate(MessageTemplate::error(e));
return Ok(template.into_response());
}
Ok(new_user) => new_user,
};
insert_user(&connection_pool, new_user)
.await
.context("Could not insert user in database.")?;
let template = HtmlTemplate(MessageTemplate::success(
"The new user has been created.".into(),
));
Ok(template.into_response())
}
async fn insert_user(connection_pool: &PgPool, new_user: NewUser) -> Result<Uuid, sqlx::Error> {
let user_id = Uuid::new_v4();
sqlx::query!(
r#"
INSERT INTO users (user_id, username, password_hash, role)
VALUES ($1, $2, $3, $4)
"#,
user_id,
new_user.username,
new_user.password_hash.expose_secret(),
new_user.role as _
)
.execute(connection_pool)
.await?;
Ok(user_id)
}
#[derive(serde::Deserialize)]
pub struct SubscriberPathParams {
pub user_id: Uuid,
}
#[tracing::instrument(name = "Delete user from database", skip(connection_pool))]
pub async fn delete_user(
State(AppState {
connection_pool, ..
}): State<AppState>,
Path(SubscriberPathParams { user_id }): Path<SubscriberPathParams>,
) -> Result<Response, AppError> {
let result = sqlx::query!("DELETE FROM users WHERE user_id = $1", user_id)
.execute(&connection_pool)
.await
.context("Failed to delete user from database.")?;
let template = if result.rows_affected() == 0 {
HtmlTemplate(MessageTemplate::error(
"The user could not be deleted.".into(),
))
} else {
HtmlTemplate(MessageTemplate::success(
"The user has been deleted.".into(),
))
};
Ok(template.into_response())
}
#[tracing::instrument(name = "Fetching user data", skip(connection_pool, session))]
pub async fn user_profile(
session: TypedSession,
State(AppState {
connection_pool, ..
}): State<AppState>,
Path(username): Path<String>,
) -> Result<Response, AppError> {
match fetch_user_data(&connection_pool, &username)
.await
.context("Failed to fetch user data.")?
{
Some(user) => {
let posts = fetch_user_posts(&connection_pool, &user.user_id)
.await
.context("Could not fetch user posts.")?;
let session_user_id = session
.get_user_id()
.await
.context("Could not fetch session username.")?;
let profile_user_id =
sqlx::query!("SELECT user_id FROM users WHERE username = $1", username)
.fetch_one(&connection_pool)
.await
.context("Could not fetch profile user id.")?
.user_id;
let last_seen = sqlx::query!(
"
SELECT login_time FROM user_logins
WHERE user_id = $1
ORDER BY login_time DESC
",
profile_user_id
)
.fetch_optional(&connection_pool)
.await
.context("Failed to fetch last user login")?
.map(|r| r.login_time);
let template = HtmlTemplate(UserTemplate {
user,
session_user_id,
last_seen,
posts,
});
Ok(template.into_response())
}
None => {
tracing::error!(username = %username, "user not found");
Ok(not_found_html().into_response())
}
}
}
#[tracing::instrument(name = "Fetching user profile", skip_all)]
async fn fetch_user_data(
connection_pool: &PgPool,
username: &str,
) -> Result<Option<UserEntry>, sqlx::Error> {
sqlx::query_as!(
UserEntry,
r#"
SELECT user_id, username, full_name, role as "role: Role", member_since, bio
FROM users
WHERE username = $1
"#,
username
)
.fetch_optional(connection_pool)
.await
}
#[tracing::instrument(name = "Fetching user posts", skip_all)]
async fn fetch_user_posts(
connection_pool: &PgPool,
user_id: &Uuid,
) -> Result<Vec<PostEntry>, sqlx::Error> {
sqlx::query_as!(
PostEntry,
r#"
SELECT p.author_id, u.username as author, u.full_name,
p.post_id, p.title, p.content, p.published_at, p.last_modified
FROM posts p
INNER JOIN users u ON p.author_id = u.user_id
WHERE p.author_id = $1
ORDER BY p.published_at DESC
"#,
user_id
)
.fetch_all(connection_pool)
.await
}
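
NewUser::try_from above calls verify_password before hashing; that function is imported from routes and not shown in this diff. A hypothetical sketch of a password-policy check (the concrete length bounds are an assumption, not the project's actual policy):

pub fn verify_password(password: &str) -> Result<(), anyhow::Error> {
    // Hypothetical policy: length bounds only; the real rules may differ.
    if password.len() < 12 {
        anyhow::bail!("Password must be at least 12 characters long.");
    }
    if password.len() > 128 {
        anyhow::bail!("Password must be at most 128 characters long.");
    }
    Ok(())
}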

View File

@@ -3,6 +3,8 @@ use std::result;
use tower_sessions::{Session, session::Error};
use uuid::Uuid;
use crate::authentication::Role;
pub struct TypedSession(Session);
type Result<T> = result::Result<T, Error>;
@@ -10,6 +12,7 @@ type Result<T> = result::Result<T, Error>;
impl TypedSession {
const USER_ID_KEY: &'static str = "user_id";
const USERNAME_KEY: &'static str = "username";
const ROLE_KEY: &'static str = "role";
pub async fn renew(&self) -> Result<()> {
self.0.cycle_id().await
@@ -31,6 +34,23 @@ impl TypedSession {
self.0.get(Self::USERNAME_KEY).await
}
pub async fn insert_role(&self, role: Role) -> Result<()> {
self.0.insert(Self::ROLE_KEY, role).await
}
pub async fn get_role(&self) -> Result<Option<Role>> {
self.0.get(Self::ROLE_KEY).await
}
pub async fn has_admin_permissions(&self) -> Result<bool> {
let role = self.0.get(Self::ROLE_KEY).await?;
if let Some(Role::Admin) = role {
Ok(true)
} else {
Ok(false)
}
}
pub async fn clear(&self) {
self.0.clear().await;
}
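
The startup diff below guards the /admin routes with middleware::from_fn(require_admin), which is not itself shown here. A minimal sketch of what it might look like on top of has_admin_permissions, assuming TypedSession implements FromRequestParts (its use as a handler extractor suggests it does):

use axum::{
    extract::Request,
    middleware::Next,
    response::{IntoResponse, Response},
};
use crate::{
    session_state::TypedSession,
    templates::{ErrorTemplate, HtmlTemplate},
};

pub async fn require_admin(session: TypedSession, request: Request, next: Next) -> Response {
    match session.has_admin_permissions().await {
        // Admins pass through to the wrapped handler.
        Ok(true) => next.run(request).await,
        // Anyone else gets the Forbidden template; a session-store error
        // falls back to the internal-error template.
        Ok(false) => HtmlTemplate(ErrorTemplate::Forbidden).into_response(),
        Err(_) => HtmlTemplate(ErrorTemplate::InternalServer).into_response(),
    }
}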

View File

@@ -1,21 +1,21 @@
use crate::{configuration::Settings, email_client::EmailClient, routes::require_auth, routes::*};
use anyhow::Context;
use axum::{
Router,
body::Bytes,
extract::MatchedPath,
http::Request,
middleware,
routing::{get, post},
response::{IntoResponse, Response},
routing::{delete, get, post, put},
};
use axum_server::tls_rustls::RustlsConfig;
use secrecy::ExposeSecret;
use reqwest::{StatusCode, header};
use sqlx::{PgPool, postgres::PgPoolOptions};
use std::{net::TcpListener, sync::Arc, time::Duration};
use std::{sync::Arc, time::Duration};
use tokio::net::TcpListener;
use tower_http::{services::ServeDir, trace::TraceLayer};
use tower_sessions::SessionManagerLayer;
use tower_sessions_redis_store::{
RedisStore,
fred::prelude::{ClientLike, Config, Pool},
};
use tower_sessions_redis_store::{RedisStore, fred::prelude::Pool};
use uuid::Uuid;
#[derive(Clone)]
@@ -28,7 +28,6 @@ pub struct AppState {
pub struct Application {
listener: TcpListener,
router: Router,
tls_config: Option<RustlsConfig>,
}
impl Application {
@@ -43,57 +42,26 @@ impl Application {
))
.connect_lazy_with(configuration.database.with_db());
let email_client = EmailClient::build(configuration.email_client).unwrap();
let pool = Pool::new(
Config::from_url(configuration.redis_uri.expose_secret())
.expect("Failed to parse Redis URL string"),
None,
None,
None,
6,
)
.unwrap();
pool.connect();
pool.wait_for_connect().await.unwrap();
let redis_store = RedisStore::new(pool);
let redis_store = configuration
.kv_store
.session_store()
.await
.context("Failed to acquire Redis session store.")?;
let router = app(
connection_pool,
email_client,
configuration.application.base_url,
redis_store,
);
let tls_config = if configuration.require_tls {
Some(
RustlsConfig::from_pem_file(
std::env::var("APP_TLS_CERT")
.expect("Failed to read TLS certificate environment variable"),
std::env::var("APP_TLS_KEY")
.expect("Feiled to read TLS private key environment variable"),
)
.await
.expect("Could not create TLS configuration"),
)
} else {
None
};
let listener = TcpListener::bind(address).unwrap();
Ok(Self {
listener,
router,
tls_config,
})
let listener = TcpListener::bind(address)
.await
.context("Could not bind application address.")?;
Ok(Self { listener, router })
}
pub async fn run_until_stopped(self) -> Result<(), std::io::Error> {
tracing::debug!("Listening on {}", self.local_addr());
if let Some(tls_config) = self.tls_config {
axum_server::from_tcp_rustls(self.listener, tls_config)
.serve(self.router.into_make_service())
.await
} else {
axum_server::from_tcp(self.listener)
.serve(self.router.into_make_service())
.await
}
tracing::debug!("listening on {}", self.local_addr());
axum::serve(self.listener, self.router).await
}
pub fn local_addr(&self) -> String {
@@ -117,14 +85,26 @@ pub fn app(
base_url,
};
let admin_routes = Router::new()
.route("/subscribers", get(get_subscribers_page))
.route("/subscribers/{subscriber_id}", delete(delete_subscriber))
.route("/posts", get(get_posts_page_dashboard))
.route("/posts/{post_id}", delete(delete_post))
.route("/users", post(create_user))
.route("/users/{user_id}", delete(delete_user))
.route("/comments", get(get_all_comments))
.route("/comments/{comment_id}", delete(delete_comment))
.layer(middleware::from_fn(require_admin));
let auth_routes = Router::new()
.route("/dashboard", get(admin_dashboard))
.route("/password", post(change_password))
.route("/newsletters", post(publish_newsletter))
.route("/posts", post(create_post))
.route("/logout", post(logout))
.route("/posts/{post_id}", put(update_post))
.route("/logout", get(logout))
.route("/users/edit", get(user_edit_form).put(update_user))
.nest("/admin", admin_routes)
.layer(middleware::from_fn(require_auth));
Router::new()
.nest_service("/assets", ServeDir::new("assets"))
.route("/", get(home))
.route("/login", get(get_login).post(post_login))
.route("/health_check", get(health_check))
@@ -133,8 +113,16 @@ pub fn app(
.route("/unsubscribe", get(get_unsubscribe).post(post_unsubscribe))
.route("/unsubscribe/confirm", get(unsubscribe_confirm))
.route("/posts", get(list_posts))
.route("/posts/load_more", get(load_more))
.route("/posts/{post_id}", get(see_post))
.nest("/admin", admin_routes)
.route(
"/posts/{post_id}/comments",
post(post_comment).get(get_comments),
)
.route("/users/{username}", get(user_profile))
.route("/favicon.ico", get(favicon))
.merge(auth_routes)
.nest_service("/assets", ServeDir::new("assets"))
.layer(
TraceLayer::new_for_http().make_span_with(|request: &Request<_>| {
let matched_path = request
@@ -148,7 +136,6 @@ pub fn app(
method = ?request.method(),
matched_path,
request_id,
some_other_field = tracing::field::Empty,
)
}),
)
@@ -156,3 +143,22 @@ pub fn app(
.fallback(not_found)
.with_state(app_state)
}
pub async fn favicon() -> Response {
match tokio::fs::read("assets/favicon.png").await {
Ok(content) => {
let bytes = Bytes::from(content);
let body = bytes.into();
Response::builder()
.status(StatusCode::OK)
.header(header::CONTENT_TYPE, "image/x-icon")
.header(header::CACHE_CONTROL, "public, max-age=86400")
.body(body)
.unwrap()
}
Err(e) => {
tracing::error!(error = %e, "Error while reading favicon.ico.");
StatusCode::NOT_FOUND.into_response()
}
}
}

View File

@@ -1,12 +1,14 @@
use tokio::task::JoinHandle;
use tracing_bunyan_formatter::{BunyanFormattingLayer, JsonStorageLayer};
use tracing_subscriber::{fmt::MakeWriter, layer::SubscriberExt, util::SubscriberInitExt};
use tracing_subscriber::{
fmt::{MakeWriter, format::FmtSpan},
layer::SubscriberExt,
util::SubscriberInitExt,
};
pub fn init_subscriber<Sink>(sink: Sink)
where
Sink: for<'a> MakeWriter<'a> + Send + Sync + 'static,
{
let formatting_layer = BunyanFormattingLayer::new(env!("CARGO_CRATE_NAME").into(), sink);
tracing_subscriber::registry()
.with(
tracing_subscriber::EnvFilter::try_from_default_env().unwrap_or_else(|_| {
@@ -17,8 +19,12 @@ where
.into()
}),
)
.with(JsonStorageLayer)
.with(formatting_layer)
.with(
tracing_subscriber::fmt::layer()
.pretty()
.with_writer(sink)
.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE),
)
.init();
}
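
A minimal usage sketch, assuming the binary installs the subscriber once at startup (the actual main.rs is not part of this diff); std::io::stdout satisfies the MakeWriter bound through tracing-subscriber's blanket impl for closures returning an io::Write:

fn main() {
    // Install the pretty-printing subscriber before anything logs.
    init_subscriber(std::io::stdout);
    // ... build and run the application from here.
}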

View File

@@ -1,65 +1,178 @@
use crate::domain::PostEntry;
use crate::{
authentication::AuthenticatedUser,
domain::{CommentEntry, PostEntry, SubscriberEntry, UserEntry},
routes::{AppError, DashboardStats},
};
use askama::Template;
use axum::response::{Html, IntoResponse};
use chrono::{DateTime, Utc};
use uuid::Uuid;
#[derive(Template)]
pub enum MessageTemplate {
#[template(path = "../templates/success.html")]
Success { message: String },
#[template(path = "../templates/error.html")]
Error { message: String },
pub struct HtmlTemplate<T>(pub T);
impl<T> IntoResponse for HtmlTemplate<T>
where
T: Template,
{
fn into_response(self) -> axum::response::Response {
match self.0.render() {
Ok(html) => Html(html).into_response(),
Err(err) => AppError::unexpected_page(anyhow::anyhow!(err)).into_response(),
}
}
}
#[derive(Template)]
#[template(path = "../templates/500.html")]
pub struct InternalErrorTemplate;
#[template(path = "user/profile.html")]
pub struct UserTemplate {
pub user: UserEntry,
pub session_user_id: Option<Uuid>,
pub posts: Vec<PostEntry>,
pub last_seen: Option<DateTime<Utc>>,
}
#[derive(Template)]
#[template(path = "user/edit.html")]
pub struct UserEditTemplate {
pub user: UserEntry,
}
#[derive(Template)]
#[template(path = "message.html")]
pub struct MessageTemplate {
pub message: String,
pub error: bool,
}
impl MessageTemplate {
pub fn success(message: String) -> Self {
Self {
message,
error: false,
}
}
pub fn error(message: String) -> Self {
Self {
message,
error: true,
}
}
}
#[derive(Template)]
#[template(path = "../templates/login.html")]
pub struct LoginTemplate;
#[derive(Template)]
#[template(path = "../templates/dashboard.html")]
#[template(path = "dashboard/dashboard.html")]
pub struct DashboardTemplate {
pub username: String,
pub user: AuthenticatedUser,
pub idempotency_key_1: String,
pub idempotency_key_2: String,
pub stats: DashboardStats,
pub subscribers: Vec<SubscriberEntry>,
pub current_page: i64,
pub max_page: i64,
pub count: i64,
pub users: Vec<UserEntry>,
pub posts: Vec<PostEntry>,
pub posts_current_page: i64,
pub posts_max_page: i64,
pub posts_count: i64,
pub comments: Vec<CommentEntry>,
pub comments_current_page: i64,
pub comments_max_page: i64,
pub comments_count: i64,
}
#[derive(Template)]
#[template(path = "../templates/home.html")]
#[template(path = "dashboard/posts/list.html", block = "posts")]
pub struct PostsPageDashboardTemplate {
pub posts: Vec<PostEntry>,
pub posts_current_page: i64,
pub posts_max_page: i64,
}
#[derive(Template)]
#[template(path = "dashboard/comments/list.html", block = "comments")]
pub struct CommentsPageDashboardTemplate {
pub comments: Vec<CommentEntry>,
pub comments_current_page: i64,
pub comments_max_page: i64,
}
#[derive(Template)]
#[template(path = "home.html")]
pub struct HomeTemplate;
#[derive(Template)]
#[template(path = "../templates/posts.html")]
#[template(path = "posts/list.html")]
pub struct PostsTemplate {
pub posts: Vec<PostEntry>,
pub next_page: Option<i64>,
}
#[derive(Template)]
#[template(path = "../templates/post.html")]
#[template(path = "posts/list.html", block = "posts")]
pub struct PostListTemplate {
pub posts: Vec<PostEntry>,
pub next_page: Option<i64>,
}
#[derive(Template)]
#[template(path = "posts/page.html")]
pub struct PostTemplate {
pub post: PostEntry,
pub post_html: String,
pub idempotency_key: String,
pub comments: Vec<CommentEntry>,
pub current_page: i64,
pub max_page: i64,
pub comments_count: i64,
pub session_user_id: Option<Uuid>,
pub session_username: Option<String>,
}
#[derive(Template)]
#[template(path = "../templates/confirm.html")]
#[template(path = "posts/comments/list.html", block = "comments")]
pub struct CommentsList {
pub comments: Vec<CommentEntry>,
pub current_page: i64,
pub max_page: i64,
}
#[derive(Template)]
#[template(path = "dashboard/subscribers/list.html", block = "subs")]
pub struct SubListTemplate {
pub subscribers: Vec<SubscriberEntry>,
pub current_page: i64,
pub max_page: i64,
}
#[derive(Template)]
#[template(path = "subscribe/confirm.html")]
pub struct ConfirmTemplate;
#[derive(Template)]
#[template(path = "../templates/404.html")]
pub struct NotFoundTemplate;
pub enum ErrorTemplate {
#[template(path = "error/404.html")]
NotFound,
#[template(path = "error/500.html")]
InternalServer,
#[template(path = "error/403.html")]
Forbidden,
}
#[derive(Template)]
#[template(path = "../templates/unsubscribe_confirm.html")]
#[template(path = "unsubscribe/confirm.html")]
pub struct UnsubscribeConfirmTemplate;
#[derive(Template)]
#[template(path = "../templates/unsubscribe.html")]
#[template(path = "unsubscribe/form.html")]
pub struct UnsubscribeTemplate;
#[derive(Template)]
#[template(path = "../templates/email/new_post.html")]
#[template(path = "email/new_post.html")]
pub struct NewPostEmailTemplate<'a> {
pub base_url: &'a str,
pub post_title: &'a str,
@@ -68,7 +181,7 @@ pub struct NewPostEmailTemplate<'a> {
}
#[derive(Template)]
#[template(path = "../templates/email/standalone.html")]
#[template(path = "email/standalone.html")]
pub struct StandaloneEmailTemplate<'a> {
pub base_url: &'a str,
pub html_content: &'a str,
@@ -94,7 +207,7 @@ I just published a new post that I think you'll find interesting:
"{}"
Read the full post: {}/posts/{}
Read the full post: {}/posts/{}?origin=EMAIL_ID
This post covers practical insights and real-world examples that I hope will be valuable for your backend development journey.

Some files were not shown because too many files have changed in this diff.