Compare commits

...

13 Commits

Author SHA1 Message Date
Elisiário Couto
cecde13486 fix: Resolve test isolation and mypy type checking issues.
- Fix test_database_service.py fixture using wrong attribute name (_db_path instead of _database_path)
- Add proper SQLAlchemy imports (and_, desc) for type-safe query construction
- Replace .desc() method calls with desc() function and col() wrapper
- Fix join condition to use and_() instead of boolean operators
- Fix tuple access in select queries with multiple columns
- All 109 tests passing, mypy clean, no regressions
2025-10-01 00:44:14 +01:00
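The typed-query pattern these fixes describe, as a minimal sketch (model names taken from leggen/models/database.py later in this diff; the join shape is illustrative, not the exact code that was fixed):

from sqlalchemy import and_, desc
from sqlmodel import Session, col, select

from leggen.models.database import Transaction, TransactionEnrichment

def recent_enriched_transactions(session: Session, account_id: str, limit: int = 20):
    # col() wraps model attributes so mypy sees SQLAlchemy column objects;
    # desc() is the function form swapped in for the .desc() method calls.
    statement = (
        select(Transaction, TransactionEnrichment)
        .join(
            TransactionEnrichment,
            # and_() rather than Python boolean operators in the join condition
            and_(
                col(Transaction.accountId) == col(TransactionEnrichment.accountId),
                col(Transaction.transactionId) == col(TransactionEnrichment.transactionId),
            ),
            isouter=True,
        )
        .where(col(Transaction.accountId) == account_id)
        .order_by(desc(col(Transaction.transactionDate)))
        .limit(limit)
    )
    # Selecting multiple models yields Row tuples, hence the explicit unpacking.
    return [(txn, enrichment) for txn, enrichment in session.exec(statement)]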
Elisiário Couto
dc7aed316d refactor: Migrate database service to SQLModel and Alembic.
- Add SQLModel for type-safe database models
- Implement Alembic for schema migration management
- Create 7 migrations covering all existing schema changes
- Add automatic migration system that runs on startup
- Maintain backward compatibility with existing raw SQL queries
- Remove old manual migration system
- All tests pass (109 tests)

Benefits:
- Full type safety with Pydantic validation
- Version-controlled schema changes
- Automatic migration detection and application
- Better developer experience with typed models
2025-09-30 23:34:48 +01:00
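The "automatic migration system that runs on startup" is not itself shown in this diff; a minimal sketch of that pattern using Alembic's Python API (the helper name and ini path are assumptions):

from pathlib import Path

from alembic import command
from alembic.config import Config

def run_migrations_on_startup() -> None:
    # Hypothetical helper: point Alembic at the project's alembic.ini and
    # upgrade the database to the latest revision before serving requests.
    cfg = Config(str(Path(__file__).resolve().parent / "alembic.ini"))
    command.upgrade(cfg, "head")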
Elisiário Couto
5465941058 chore: Simplify database service. 2025-09-30 22:15:01 +01:00
Elisiário Couto
e6da6ee9ab chore(ci): Bump version to 2025.9.26 2025-09-30 14:09:57 +01:00
Elisiário Couto
8802d24789 debug: Log different sets of GoCardless rate limits. 2025-09-30 14:07:10 +01:00
Elisiário Couto
d3954f079b chore(ci): Bump version to 2025.9.25 2025-09-30 10:49:00 +01:00
Elisiário Couto
0b68038739 Lock dependencies before committing next version. 2025-09-30 10:48:49 +01:00
Elisiário Couto
d36568da54 chore: Log more rate limit headers. 2025-09-30 10:46:13 +01:00
Elisiário Couto
473f126d3e feat(frontend): Add ability to list backups and create a backup on demand. 2025-09-28 23:23:44 +01:00
Elisiário Couto
222bb2ec64 Lint and reformat. 2025-09-28 23:23:44 +01:00
Elisiário Couto
22ec0e36b1 fix(api): Fix S3 backup path-style configuration and improve UX.
- Fix critical S3 client configuration bug for path-style addressing
- Add toast notifications for better user feedback on S3 config operations
- Set up Toaster component in root layout for app-wide notifications
- Clean up unused imports in test files

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-28 23:23:44 +01:00
copilot-swe-agent[bot]
0122913052 feat(frontend): Add S3 backup UI and complete backup functionality
Co-authored-by: elisiariocouto <818914+elisiariocouto@users.noreply.github.com>
2025-09-28 23:23:44 +01:00
copilot-swe-agent[bot]
7f2a4634c5 feat(api): Add S3 backup functionality to backend
Co-authored-by: elisiariocouto <818914+elisiariocouto@users.noreply.github.com>
2025-09-28 23:23:44 +01:00
38 changed files with 3379 additions and 1397 deletions

CHANGELOG.md

@@ -1,4 +1,32 @@
## 2025.9.26 (2025/09/30)
### Debug
- Log different sets of GoCardless rate limits. ([8802d247](https://github.com/elisiariocouto/leggen/commit/8802d24789cbb8e854d857a0d7cc89a25a26f378))
## 2025.9.25 (2025/09/30)
### Bug Fixes
- **api:** Fix S3 backup path-style configuration and improve UX. ([22ec0e36](https://github.com/elisiariocouto/leggen/commit/22ec0e36b11e5b017075bee51de0423a53ec4648))
### Features
- **api:** Add S3 backup functionality to backend ([7f2a4634](https://github.com/elisiariocouto/leggen/commit/7f2a4634c51814b6785433a25ce42d20aea0558c))
- **frontend:** Add S3 backup UI and complete backup functionality ([01229130](https://github.com/elisiariocouto/leggen/commit/0122913052793bcbf011cb557ef182be21c5de93))
- **frontend:** Add ability to list backups and create a backup on demand. ([473f126d](https://github.com/elisiariocouto/leggen/commit/473f126d3e699521172539f2ca0bff0579ccee51))
### Miscellaneous Tasks
- Log more rate limit headers. ([d36568da](https://github.com/elisiariocouto/leggen/commit/d36568da540d4fb4ae1fa10b322a3fa77dcc5360))
## 2025.9.24 (2025/09/25)
### Features

alembic.ini (new file, 148 lines)

@@ -0,0 +1,148 @@
# A generic, single database configuration.
[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic
# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s
# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
prepend_sys_path = .
# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =
# max length of characters to apply to the "slug" field
# truncate_slug_length = 40
# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false
# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false
# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions
# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
# "version_path_separator" key, which if absent then falls back to the legacy
# behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
# behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os
# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false
# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8
# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
# Note: The actual URL is configured programmatically in env.py
# sqlalchemy.url = driver://user:pass@localhost/dbname
[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples
# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME
# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# hooks = ruff
# ruff.type = module
# ruff.module = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME
# Alternatively, use the exec runner to execute a binary found on your PATH
# hooks = ruff
# ruff.type = exec
# ruff.executable = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME
# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARNING
handlers = console
qualname =
[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S

alembic/README (new file, 1 line)

@@ -0,0 +1 @@
Generic single-database configuration.

alembic/env.py (new file, 78 lines)

@@ -0,0 +1,78 @@
from logging.config import fileConfig
from sqlalchemy import engine_from_config, pool
from alembic import context
from leggen.models.database import SQLModel
from leggen.services.database import get_database_url
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
# Set the database URL from our configuration
config.set_main_option("sqlalchemy.url", get_database_url())
# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# add your model's MetaData object here
# for 'autogenerate' support
target_metadata = SQLModel.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.
This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.
Calls to context.execute() here emit the given string to the
script output.
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode.
In this scenario we need to create an Engine
and associate a connection with the context.
"""
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
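
Because target_metadata is wired to SQLModel.metadata, Alembic's autogenerate can diff the models against the live schema. A hedged sketch of cutting a new revision programmatically (equivalent to `alembic revision --autogenerate -m ...`; it needs a reachable database, since this runs env.py in online mode):

from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")
# Compares SQLModel.metadata with the database at sqlalchemy.url and writes a
# new file under alembic/versions/, rendered from script.py.mako (shown next).
command.revision(cfg, message="describe_schema_change", autogenerate=True)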

alembic/script.py.mako (new file, 28 lines)

@@ -0,0 +1,28 @@
"""${message}
Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}
"""
from typing import Sequence, Union
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}
# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}
def upgrade() -> None:
"""Upgrade schema."""
${upgrades if upgrades else "pass"}
def downgrade() -> None:
"""Downgrade schema."""
${downgrades if downgrades else "pass"}

alembic/versions/1ba02efe481c_migrate_to_composite_key.py (new file, 102 lines)

@@ -0,0 +1,102 @@
"""migrate_to_composite_key
Migrate transactions table to use composite primary key (accountId, transactionId).
Revision ID: 1ba02efe481c
Revises: bf30246cb723
Create Date: 2025-09-30 23:16:34.637762
"""
from typing import Sequence, Union
from sqlalchemy import text
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "1ba02efe481c"
down_revision: Union[str, Sequence[str], None] = "bf30246cb723"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Migrate to composite primary key."""
conn = op.get_bind()
# Check if migration is needed
result = conn.execute(
text("""
SELECT name FROM sqlite_master
WHERE type='table' AND name='transactions'
""")
)
if not result.fetchone():
return
# Create temporary table with new schema
op.execute("""
CREATE TABLE transactions_temp (
accountId TEXT NOT NULL,
transactionId TEXT NOT NULL,
internalTransactionId TEXT,
institutionId TEXT NOT NULL,
iban TEXT,
transactionDate DATETIME,
description TEXT,
transactionValue REAL,
transactionCurrency TEXT,
transactionStatus TEXT,
rawTransaction JSON NOT NULL,
PRIMARY KEY (accountId, transactionId)
)
""")
# Insert deduplicated data (keep most recent duplicate)
op.execute("""
INSERT INTO transactions_temp
SELECT
accountId,
json_extract(rawTransaction, '$.transactionId') as transactionId,
internalTransactionId,
institutionId,
iban,
transactionDate,
description,
transactionValue,
transactionCurrency,
transactionStatus,
rawTransaction
FROM (
SELECT *,
ROW_NUMBER() OVER (
PARTITION BY accountId, json_extract(rawTransaction, '$.transactionId')
ORDER BY transactionDate DESC, rowid DESC
) as rn
FROM transactions
WHERE json_extract(rawTransaction, '$.transactionId') IS NOT NULL
AND accountId IS NOT NULL
) WHERE rn = 1
""")
# Replace tables
op.execute("DROP TABLE transactions")
op.execute("ALTER TABLE transactions_temp RENAME TO transactions")
# Recreate indexes
op.create_index(
"idx_transactions_internal_id", "transactions", ["internalTransactionId"]
)
op.create_index("idx_transactions_date", "transactions", ["transactionDate"])
op.create_index(
"idx_transactions_account_date",
"transactions",
["accountId", "transactionDate"],
)
op.create_index("idx_transactions_amount", "transactions", ["transactionValue"])
def downgrade() -> None:
"""Not implemented - would require changing primary key back."""

alembic/versions/4819c868ebc1_add_transaction_enrichments_table.py (new file, 56 lines)

@@ -0,0 +1,56 @@
"""add_transaction_enrichments_table
Add transaction_enrichments table for storing enriched transaction data.
Revision ID: 4819c868ebc1
Revises: dd9f6a55604c
Create Date: 2025-09-30 23:20:00.969614
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "4819c868ebc1"
down_revision: Union[str, Sequence[str], None] = "dd9f6a55604c"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Create transaction_enrichments table."""
op.create_table(
"transaction_enrichments",
sa.Column("accountId", sa.String(), nullable=False),
sa.Column("transactionId", sa.String(), nullable=False),
sa.Column("clean_name", sa.String(), nullable=True),
sa.Column("category", sa.String(), nullable=True),
sa.Column("logo_url", sa.String(), nullable=True),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.Column("updated_at", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(
["accountId", "transactionId"],
["transactions.accountId", "transactions.transactionId"],
ondelete="CASCADE",
),
sa.PrimaryKeyConstraint("accountId", "transactionId"),
)
# Create indexes
op.create_index(
"idx_transaction_enrichments_category", "transaction_enrichments", ["category"]
)
op.create_index(
"idx_transaction_enrichments_clean_name",
"transaction_enrichments",
["clean_name"],
)
def downgrade() -> None:
"""Drop transaction_enrichments table."""
op.drop_table("transaction_enrichments")

alembic/versions/be8d5807feca_add_display_name_column.py (new file, 33 lines)

@@ -0,0 +1,33 @@
"""add_display_name_column
Add display_name column to accounts table.
Revision ID: be8d5807feca
Revises: 1ba02efe481c
Create Date: 2025-09-30 23:16:34.929968
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "be8d5807feca"
down_revision: Union[str, Sequence[str], None] = "1ba02efe481c"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Add display_name column to accounts table."""
with op.batch_alter_table("accounts", schema=None) as batch_op:
batch_op.add_column(sa.Column("display_name", sa.String(), nullable=True))
def downgrade() -> None:
"""Remove display_name column."""
with op.batch_alter_table("accounts", schema=None) as batch_op:
batch_op.drop_column("display_name")

alembic/versions/bf30246cb723_migrate_balance_timestamps.py (new file, 62 lines)

@@ -0,0 +1,62 @@
"""migrate_balance_timestamps
Convert Unix timestamps to datetime strings in balances table.
Revision ID: bf30246cb723
Revises: de8bfb1169d4
Create Date: 2025-09-30 23:14:03.128959
"""
from datetime import datetime
from typing import Sequence, Union
from sqlalchemy import text
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "bf30246cb723"
down_revision: Union[str, Sequence[str], None] = "de8bfb1169d4"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Convert all Unix timestamps to datetime strings."""
conn = op.get_bind()
# Get all balances with REAL timestamps
result = conn.execute(
text("""
SELECT id, timestamp
FROM balances
WHERE typeof(timestamp) = 'real'
ORDER BY id
""")
)
unix_records = result.fetchall()
if not unix_records:
return
# Convert and update in batches
for record_id, unix_timestamp in unix_records:
try:
# Convert Unix timestamp to datetime string
dt_string = datetime.fromtimestamp(float(unix_timestamp)).isoformat()
# Update the record
conn.execute(
text("UPDATE balances SET timestamp = :dt WHERE id = :id"),
{"dt": dt_string, "id": record_id},
)
except Exception:
continue
conn.commit()
def downgrade() -> None:
"""Not implemented - converting back would lose precision."""

alembic/versions/dd9f6a55604c_add_logo_column.py (new file, 33 lines)

@@ -0,0 +1,33 @@
"""add_logo_column
Add logo column to accounts table.
Revision ID: dd9f6a55604c
Revises: f854fd498a6e
Create Date: 2025-09-30 23:16:35.530858
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "dd9f6a55604c"
down_revision: Union[str, Sequence[str], None] = "f854fd498a6e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Add logo column to accounts table."""
with op.batch_alter_table("accounts", schema=None) as batch_op:
batch_op.add_column(sa.Column("logo", sa.String(), nullable=True))
def downgrade() -> None:
"""Remove logo column."""
with op.batch_alter_table("accounts", schema=None) as batch_op:
batch_op.drop_column("logo")

alembic/versions/de8bfb1169d4_create_initial_tables.py (new file, 95 lines)

@@ -0,0 +1,95 @@
"""create_initial_tables
Revision ID: de8bfb1169d4
Revises:
Create Date: 2025-09-30 23:09:24.255875
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "de8bfb1169d4"
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Create initial database tables."""
# Create accounts table
op.create_table(
"accounts",
sa.Column("id", sa.String(), nullable=False),
sa.Column("institution_id", sa.String(), nullable=False),
sa.Column("status", sa.String(), nullable=False),
sa.Column("iban", sa.String(), nullable=True),
sa.Column("name", sa.String(), nullable=True),
sa.Column("currency", sa.String(), nullable=True),
sa.Column("created", sa.DateTime(), nullable=False),
sa.Column("last_accessed", sa.DateTime(), nullable=True),
sa.Column("last_updated", sa.DateTime(), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index("idx_accounts_institution_id", "accounts", ["institution_id"])
op.create_index("idx_accounts_status", "accounts", ["status"])
# Create balances table
op.create_table(
"balances",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("account_id", sa.String(), nullable=False),
sa.Column("bank", sa.String(), nullable=False),
sa.Column("status", sa.String(), nullable=False),
sa.Column("iban", sa.String(), nullable=False),
sa.Column("amount", sa.Float(), nullable=False),
sa.Column("currency", sa.String(), nullable=False),
sa.Column("type", sa.String(), nullable=False),
sa.Column("timestamp", sa.DateTime(), nullable=False),
sa.PrimaryKeyConstraint("id"),
)
op.create_index("idx_balances_account_id", "balances", ["account_id"])
op.create_index("idx_balances_timestamp", "balances", ["timestamp"])
op.create_index(
"idx_balances_account_type_timestamp",
"balances",
["account_id", "type", "timestamp"],
)
# Create transactions table (old schema with internalTransactionId as PK)
op.create_table(
"transactions",
sa.Column("accountId", sa.String(), nullable=False),
sa.Column("transactionId", sa.String(), nullable=False),
sa.Column("internalTransactionId", sa.String(), nullable=True),
sa.Column("institutionId", sa.String(), nullable=False),
sa.Column("iban", sa.String(), nullable=True),
sa.Column("transactionDate", sa.DateTime(), nullable=True),
sa.Column("description", sa.String(), nullable=True),
sa.Column("transactionValue", sa.Float(), nullable=True),
sa.Column("transactionCurrency", sa.String(), nullable=True),
sa.Column("transactionStatus", sa.String(), nullable=True),
sa.Column("rawTransaction", sa.JSON(), nullable=False),
sa.PrimaryKeyConstraint("internalTransactionId"),
)
op.create_index(
"idx_transactions_internal_id", "transactions", ["internalTransactionId"]
)
op.create_index("idx_transactions_date", "transactions", ["transactionDate"])
op.create_index(
"idx_transactions_account_date",
"transactions",
["accountId", "transactionDate"],
)
op.create_index("idx_transactions_amount", "transactions", ["transactionValue"])
def downgrade() -> None:
"""Drop initial tables."""
op.drop_table("transactions")
op.drop_table("balances")
op.drop_table("accounts")

alembic/versions/f854fd498a6e_add_sync_operations_table.py (new file, 59 lines)

@@ -0,0 +1,59 @@
"""add_sync_operations_table
Add sync_operations table for tracking synchronization operations.
Revision ID: f854fd498a6e
Revises: be8d5807feca
Create Date: 2025-09-30 23:16:35.229062
"""
from typing import Sequence, Union
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision: str = "f854fd498a6e"
down_revision: Union[str, Sequence[str], None] = "be8d5807feca"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None
def upgrade() -> None:
"""Create sync_operations table."""
op.create_table(
"sync_operations",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("started_at", sa.DateTime(), nullable=False),
sa.Column("completed_at", sa.DateTime(), nullable=True),
sa.Column("success", sa.Boolean(), nullable=True),
sa.Column(
"accounts_processed", sa.Integer(), nullable=False, server_default="0"
),
sa.Column(
"transactions_added", sa.Integer(), nullable=False, server_default="0"
),
sa.Column(
"transactions_updated", sa.Integer(), nullable=False, server_default="0"
),
sa.Column("balances_updated", sa.Integer(), nullable=False, server_default="0"),
sa.Column("duration_seconds", sa.Float(), nullable=True),
sa.Column("errors", sa.String(), nullable=True),
sa.Column("logs", sa.String(), nullable=True),
sa.Column("trigger_type", sa.String(), nullable=False, server_default="manual"),
sa.PrimaryKeyConstraint("id"),
)
# Create indexes
op.create_index("idx_sync_operations_started_at", "sync_operations", ["started_at"])
op.create_index("idx_sync_operations_success", "sync_operations", ["success"])
op.create_index(
"idx_sync_operations_trigger_type", "sync_operations", ["trigger_type"]
)
def downgrade() -> None:
"""Drop sync_operations table."""
op.drop_table("sync_operations")


@@ -28,3 +28,13 @@ enabled = true
[filters]
case_insensitive = ["salary", "utility"]
case_sensitive = ["SpecificStore"]
# Optional: S3 backup configuration
[backup.s3]
access_key_id = "your-s3-access-key"
secret_access_key = "your-s3-secret-key"
bucket_name = "your-bucket-name"
region = "us-east-1"
# endpoint_url = "https://custom-s3-endpoint.com" # Optional: for custom S3-compatible endpoints
path_style = false # Set to true for path-style addressing
enabled = true
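
A minimal sketch of how this [backup.s3] table maps onto the S3BackupConfig Pydantic model added later in this diff (assumes Python 3.11+ for tomllib and a config.toml path; the real app loads configuration through leggen.utils.config):

import tomllib

from leggen.models.config import S3BackupConfig

with open("config.toml", "rb") as fh:  # path assumed for illustration
    raw = tomllib.load(fh)

s3 = S3BackupConfig(**raw["backup"]["s3"])  # validates types, fills defaults
print(s3.bucket_name, s3.region, s3.path_style)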

S3BackupConfigDrawer.tsx (new file, 273 lines)

@@ -0,0 +1,273 @@
import { useState, useEffect } from "react";
import { useMutation, useQueryClient } from "@tanstack/react-query";
import { Cloud, TestTube } from "lucide-react";
import { toast } from "sonner";
import { apiClient } from "../lib/api";
import { Button } from "./ui/button";
import { Input } from "./ui/input";
import { Label } from "./ui/label";
import { Switch } from "./ui/switch";
import { EditButton } from "./ui/edit-button";
import {
Drawer,
DrawerClose,
DrawerContent,
DrawerDescription,
DrawerFooter,
DrawerHeader,
DrawerTitle,
DrawerTrigger,
} from "./ui/drawer";
import type { BackupSettings, S3Config } from "../types/api";
interface S3BackupConfigDrawerProps {
settings?: BackupSettings;
trigger?: React.ReactNode;
}
export default function S3BackupConfigDrawer({
settings,
trigger,
}: S3BackupConfigDrawerProps) {
const [open, setOpen] = useState(false);
const [config, setConfig] = useState<S3Config>({
access_key_id: "",
secret_access_key: "",
bucket_name: "",
region: "us-east-1",
endpoint_url: "",
path_style: false,
enabled: true,
});
const queryClient = useQueryClient();
useEffect(() => {
if (settings?.s3) {
setConfig({ ...settings.s3 });
}
}, [settings]);
const updateMutation = useMutation({
mutationFn: (s3Config: S3Config) =>
apiClient.updateBackupSettings({
s3: s3Config,
}),
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["backupSettings"] });
setOpen(false);
toast.success("S3 backup configuration saved successfully");
},
onError: (error: Error & { response?: { data?: { detail?: string } } }) => {
console.error("Failed to update S3 backup configuration:", error);
const message =
error?.response?.data?.detail ||
"Failed to save S3 configuration. Please check your settings and try again.";
toast.error(message);
},
});
const testMutation = useMutation({
mutationFn: () =>
apiClient.testBackupConnection({
service: "s3",
config: config,
}),
onSuccess: (response) => {
if (response.success) {
console.log("S3 connection test successful");
toast.success(
"S3 connection test successful! Your configuration is working correctly.",
);
} else {
console.error("S3 connection test failed:", response.message);
toast.error(response.message || "S3 connection test failed. Please verify your credentials and settings.");
}
},
onError: (error: Error & { response?: { data?: { detail?: string } } }) => {
console.error("Failed to test S3 connection:", error);
const message =
error?.response?.data?.detail ||
"S3 connection test failed. Please verify your credentials and settings.";
toast.error(message);
},
});
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
updateMutation.mutate(config);
};
const handleTest = () => {
testMutation.mutate();
};
const isConfigValid =
config.access_key_id.trim().length > 0 &&
config.secret_access_key.trim().length > 0 &&
config.bucket_name.trim().length > 0;
return (
<Drawer open={open} onOpenChange={setOpen}>
<DrawerTrigger asChild>{trigger || <EditButton />}</DrawerTrigger>
<DrawerContent>
<div className="mx-auto w-full max-w-sm">
<DrawerHeader>
<DrawerTitle className="flex items-center space-x-2">
<Cloud className="h-5 w-5 text-primary" />
<span>S3 Backup Configuration</span>
</DrawerTitle>
<DrawerDescription>
Configure S3 settings for automatic database backups
</DrawerDescription>
</DrawerHeader>
<form onSubmit={handleSubmit} className="px-4 space-y-4">
<div className="flex items-center space-x-2">
<Switch
id="enabled"
checked={config.enabled}
onCheckedChange={(checked) =>
setConfig({ ...config, enabled: checked })
}
/>
<Label htmlFor="enabled">Enable S3 backups</Label>
</div>
{config.enabled && (
<>
<div className="space-y-2">
<Label htmlFor="access_key_id">Access Key ID</Label>
<Input
id="access_key_id"
type="text"
value={config.access_key_id}
onChange={(e) =>
setConfig({ ...config, access_key_id: e.target.value })
}
placeholder="Your AWS Access Key ID"
required
/>
</div>
<div className="space-y-2">
<Label htmlFor="secret_access_key">Secret Access Key</Label>
<Input
id="secret_access_key"
type="password"
value={config.secret_access_key}
onChange={(e) =>
setConfig({
...config,
secret_access_key: e.target.value,
})
}
placeholder="Your AWS Secret Access Key"
required
/>
</div>
<div className="space-y-2">
<Label htmlFor="bucket_name">Bucket Name</Label>
<Input
id="bucket_name"
type="text"
value={config.bucket_name}
onChange={(e) =>
setConfig({ ...config, bucket_name: e.target.value })
}
placeholder="my-backup-bucket"
required
/>
</div>
<div className="space-y-2">
<Label htmlFor="region">Region</Label>
<Input
id="region"
type="text"
value={config.region}
onChange={(e) =>
setConfig({ ...config, region: e.target.value })
}
placeholder="us-east-1"
required
/>
</div>
<div className="space-y-2">
<Label htmlFor="endpoint_url">
Custom Endpoint URL (Optional)
</Label>
<Input
id="endpoint_url"
type="url"
value={config.endpoint_url || ""}
onChange={(e) =>
setConfig({ ...config, endpoint_url: e.target.value })
}
placeholder="https://custom-s3-endpoint.com"
/>
<p className="text-xs text-muted-foreground">
For S3-compatible services like MinIO or DigitalOcean Spaces
</p>
</div>
<div className="flex items-center space-x-2">
<Switch
id="path_style"
checked={config.path_style}
onCheckedChange={(checked) =>
setConfig({ ...config, path_style: checked })
}
/>
<Label htmlFor="path_style">Use path-style addressing</Label>
</div>
<p className="text-xs text-muted-foreground">
Enable for older S3 implementations or certain S3-compatible
services
</p>
</>
)}
<DrawerFooter className="px-0">
<div className="flex space-x-2">
<Button
type="submit"
disabled={updateMutation.isPending || !config.enabled}
>
{updateMutation.isPending
? "Saving..."
: "Save Configuration"}
</Button>
{config.enabled && isConfigValid && (
<Button
type="button"
variant="outline"
onClick={handleTest}
disabled={testMutation.isPending}
>
{testMutation.isPending ? (
<>
<TestTube className="h-4 w-4 mr-2 animate-spin" />
Testing...
</>
) : (
<>
<TestTube className="h-4 w-4 mr-2" />
Test
</>
)}
</Button>
)}
</div>
<DrawerClose asChild>
<Button variant="ghost">Cancel</Button>
</DrawerClose>
</DrawerFooter>
</form>
</div>
</DrawerContent>
</Drawer>
);
}

Settings.tsx

@@ -16,7 +16,11 @@ import {
Trash2,
User,
Filter,
Cloud,
Archive,
Eye,
} from "lucide-react";
import { toast } from "sonner";
import { apiClient } from "../lib/api";
import { formatCurrency, formatDate } from "../lib/utils";
import {
@@ -35,11 +39,14 @@ import NotificationFiltersDrawer from "./NotificationFiltersDrawer";
import DiscordConfigDrawer from "./DiscordConfigDrawer";
import TelegramConfigDrawer from "./TelegramConfigDrawer";
import AddBankAccountDrawer from "./AddBankAccountDrawer";
import S3BackupConfigDrawer from "./S3BackupConfigDrawer";
import type {
Account,
Balance,
NotificationSettings,
NotificationService,
BackupSettings,
BackupInfo,
} from "../types/api";
// Helper function to get status indicator color and styles
@@ -80,6 +87,7 @@ export default function Settings() {
const [editingAccountId, setEditingAccountId] = useState<string | null>(null);
const [editingName, setEditingName] = useState("");
const [failedImages, setFailedImages] = useState<Set<string>>(new Set());
const [showBackups, setShowBackups] = useState(false);
const queryClient = useQueryClient();
@@ -125,6 +133,28 @@ export default function Settings() {
queryFn: apiClient.getBankConnectionsStatus,
});
// Backup queries
const {
data: backupSettings,
isLoading: backupLoading,
error: backupError,
refetch: refetchBackup,
} = useQuery<BackupSettings>({
queryKey: ["backupSettings"],
queryFn: apiClient.getBackupSettings,
});
const {
data: backups,
isLoading: backupsLoading,
error: backupsError,
refetch: refetchBackups,
} = useQuery<BackupInfo[]>({
queryKey: ["backups"],
queryFn: apiClient.listBackups,
enabled: showBackups,
});
// Account mutations
const updateAccountMutation = useMutation({
mutationFn: ({ id, display_name }: { id: string; display_name: string }) =>
@@ -158,6 +188,26 @@ export default function Settings() {
},
});
// Backup mutations
const createBackupMutation = useMutation({
mutationFn: () => apiClient.performBackupOperation({ operation: "backup" }),
onSuccess: (response) => {
if (response.success) {
toast.success(response.message || "Backup created successfully!");
queryClient.invalidateQueries({ queryKey: ["backups"] });
} else {
toast.error(response.message || "Failed to create backup.");
}
},
onError: (error: Error & { response?: { data?: { detail?: string } } }) => {
console.error("Failed to create backup:", error);
const message =
error?.response?.data?.detail ||
"Failed to create backup. Please check your S3 configuration.";
toast.error(message);
},
});
// Account handlers
const handleEditStart = (account: Account) => {
setEditingAccountId(account.id);
@@ -189,8 +239,27 @@ export default function Settings() {
}
};
const isLoading = accountsLoading || settingsLoading || servicesLoading;
const hasError = accountsError || settingsError || servicesError;
// Backup handlers
const handleCreateBackup = () => {
if (!backupSettings?.s3?.enabled) {
toast.error("S3 backup is not enabled. Please configure and enable S3 backup first.");
return;
}
createBackupMutation.mutate();
};
const handleViewBackups = () => {
if (!backupSettings?.s3?.enabled) {
toast.error("S3 backup is not enabled. Please configure and enable S3 backup first.");
return;
}
setShowBackups(true);
};
const isLoading =
accountsLoading || settingsLoading || servicesLoading || backupLoading;
const hasError =
accountsError || settingsError || servicesError || backupError;
if (isLoading) {
return <AccountsSkeleton />;
@@ -211,6 +280,7 @@ export default function Settings() {
refetchAccounts();
refetchSettings();
refetchServices();
refetchBackup();
}}
variant="outline"
size="sm"
@@ -226,7 +296,7 @@ export default function Settings() {
return (
<div className="space-y-6">
<Tabs defaultValue="accounts" className="space-y-6">
<TabsList className="grid w-full grid-cols-2">
<TabsList className="grid w-full grid-cols-3">
<TabsTrigger value="accounts" className="flex items-center space-x-2">
<User className="h-4 w-4" />
<span>Accounts</span>
@@ -238,6 +308,10 @@ export default function Settings() {
<Bell className="h-4 w-4" />
<span>Notifications</span>
</TabsTrigger>
<TabsTrigger value="backup" className="flex items-center space-x-2">
<Cloud className="h-4 w-4" />
<span>Backup</span>
</TabsTrigger>
</TabsList>
<TabsContent value="accounts" className="space-y-6">
@@ -728,6 +802,174 @@ export default function Settings() {
</CardContent>
</Card>
</TabsContent>
<TabsContent value="backup" className="space-y-6">
{/* S3 Backup Configuration */}
<Card>
<CardHeader>
<CardTitle className="flex items-center space-x-2">
<Cloud className="h-5 w-5 text-primary" />
<span>S3 Backup Configuration</span>
</CardTitle>
<CardDescription>
Configure automatic database backups to Amazon S3 or
S3-compatible storage
</CardDescription>
</CardHeader>
<CardContent>
{!backupSettings?.s3 ? (
<div className="text-center py-8">
<Cloud className="h-12 w-12 text-muted-foreground mx-auto mb-4" />
<h3 className="text-lg font-medium text-foreground mb-2">
No S3 backup configured
</h3>
<p className="text-muted-foreground mb-4">
Set up S3 backup to automatically backup your database to
the cloud.
</p>
<S3BackupConfigDrawer settings={backupSettings} />
</div>
) : (
<div className="space-y-4">
<div className="flex items-center justify-between p-4 border rounded-lg">
<div className="flex items-center space-x-4">
<div className="p-3 bg-muted rounded-full">
<Cloud className="h-6 w-6 text-muted-foreground" />
</div>
<div>
<div className="flex items-center space-x-3">
<h4 className="text-lg font-medium text-foreground">
S3 Backup
</h4>
<div className="flex items-center space-x-2">
<div
className={`w-2 h-2 rounded-full ${
backupSettings.s3.enabled
? "bg-green-500"
: "bg-muted-foreground"
}`}
/>
<span className="text-sm text-muted-foreground">
{backupSettings.s3.enabled
? "Enabled"
: "Disabled"}
</span>
</div>
</div>
<div className="mt-2 space-y-1">
<p className="text-sm text-muted-foreground">
<span className="font-medium">Bucket:</span>{" "}
{backupSettings.s3.bucket_name}
</p>
<p className="text-sm text-muted-foreground">
<span className="font-medium">Region:</span>{" "}
{backupSettings.s3.region}
</p>
{backupSettings.s3.endpoint_url && (
<p className="text-sm text-muted-foreground">
<span className="font-medium">Endpoint:</span>{" "}
{backupSettings.s3.endpoint_url}
</p>
)}
</div>
</div>
</div>
<S3BackupConfigDrawer settings={backupSettings} />
</div>
<div className="p-4 bg-muted rounded-lg">
<h5 className="font-medium mb-2">Backup Information</h5>
<p className="text-sm text-muted-foreground mb-3">
Database backups are stored in the "leggen_backups/"
folder in your S3 bucket. Backups include the complete
SQLite database file.
</p>
<div className="flex space-x-2">
<Button
size="sm"
variant="outline"
onClick={handleCreateBackup}
disabled={createBackupMutation.isPending}
>
{createBackupMutation.isPending ? (
<>
<Archive className="h-4 w-4 mr-2 animate-spin" />
Creating...
</>
) : (
<>
<Archive className="h-4 w-4 mr-2" />
Create Backup Now
</>
)}
</Button>
<Button
size="sm"
variant="outline"
onClick={handleViewBackups}
>
<Eye className="h-4 w-4 mr-2" />
View Backups
</Button>
</div>
</div>
{/* Backup List Modal/View */}
{showBackups && (
<div className="mt-6 p-4 border rounded-lg bg-background">
<div className="flex items-center justify-between mb-4">
<h5 className="font-medium">Available Backups</h5>
<Button
size="sm"
variant="ghost"
onClick={() => setShowBackups(false)}
>
<X className="h-4 w-4" />
</Button>
</div>
{backupsLoading ? (
<p className="text-sm text-muted-foreground">Loading backups...</p>
) : backupsError ? (
<div className="space-y-2">
<p className="text-sm text-destructive">Failed to load backups</p>
<Button
size="sm"
variant="outline"
onClick={() => refetchBackups()}
>
<RefreshCw className="h-4 w-4 mr-2" />
Retry
</Button>
</div>
) : !backups || backups.length === 0 ? (
<p className="text-sm text-muted-foreground">No backups found</p>
) : (
<div className="space-y-2">
{backups.map((backup, index) => (
<div
key={backup.key || index}
className="flex items-center justify-between p-3 border rounded bg-muted/50"
>
<div>
<p className="text-sm font-medium">{backup.key}</p>
<div className="flex items-center space-x-4 text-xs text-muted-foreground mt-1">
<span>Modified: {formatDate(backup.last_modified)}</span>
<span>Size: {(backup.size / 1024 / 1024).toFixed(2)} MB</span>
</div>
</div>
</div>
))}
</div>
)}
</div>
)}
</div>
)}
</CardContent>
</Card>
</TabsContent>
</Tabs>
</div>
);

lib/api.ts

@@ -17,6 +17,10 @@ import type {
BankConnectionStatus,
BankRequisition,
Country,
BackupSettings,
BackupTest,
BackupInfo,
BackupOperation,
} from "../types/api";
// Use VITE_API_URL for development, relative URLs for production
@@ -274,6 +278,38 @@ export const apiClient = {
const response = await api.get<ApiResponse<Country[]>>("/banks/countries");
return response.data.data;
},
// Backup endpoints
getBackupSettings: async (): Promise<BackupSettings> => {
const response =
await api.get<ApiResponse<BackupSettings>>("/backup/settings");
return response.data.data;
},
updateBackupSettings: async (
settings: BackupSettings,
): Promise<BackupSettings> => {
const response = await api.put<ApiResponse<BackupSettings>>(
"/backup/settings",
settings,
);
return response.data.data;
},
testBackupConnection: async (test: BackupTest): Promise<ApiResponse<{ connected?: boolean }>> => {
const response = await api.post<ApiResponse<{ connected?: boolean }>>("/backup/test", test);
return response.data;
},
listBackups: async (): Promise<BackupInfo[]> => {
const response = await api.get<ApiResponse<BackupInfo[]>>("/backup/list");
return response.data.data;
},
performBackupOperation: async (operation: BackupOperation): Promise<ApiResponse<{ operation: string; completed: boolean }>> => {
const response = await api.post<ApiResponse<{ operation: string; completed: boolean }>>("/backup/operation", operation);
return response.data;
},
};
export default apiClient;


@@ -5,6 +5,7 @@ import { PWAInstallPrompt, PWAUpdatePrompt } from "../components/PWAPrompts";
import { usePWA } from "../hooks/usePWA";
import { useVersionCheck } from "../hooks/useVersionCheck";
import { SidebarInset, SidebarProvider } from "../components/ui/sidebar";
import { Toaster } from "../components/ui/sonner";
function RootLayout() {
const { updateAvailable, updateSW, forceReload } = usePWA();
@@ -48,6 +49,9 @@ function RootLayout() {
updateAvailable={updateAvailable}
onUpdate={handlePWAUpdate}
/>
{/* Toast Notifications */}
<Toaster />
</SidebarProvider>
);
}

types/api.ts

@@ -277,3 +277,34 @@ export interface Country {
code: string;
name: string;
}
// Backup types
export interface S3Config {
access_key_id: string;
secret_access_key: string;
bucket_name: string;
region: string;
endpoint_url?: string;
path_style: boolean;
enabled: boolean;
}
export interface BackupSettings {
s3?: S3Config;
}
export interface BackupTest {
service: string;
config: S3Config;
}
export interface BackupInfo {
key: string;
last_modified: string;
size: number;
}
export interface BackupOperation {
operation: string;
backup_key?: string;
}

leggen/api/models/backup.py (new file, 49 lines)

@@ -0,0 +1,49 @@
"""API models for backup endpoints."""
from typing import Optional
from pydantic import BaseModel, Field
class S3Config(BaseModel):
"""S3 backup configuration model for API."""
access_key_id: str = Field(..., description="AWS S3 access key ID")
secret_access_key: str = Field(..., description="AWS S3 secret access key")
bucket_name: str = Field(..., description="S3 bucket name")
region: str = Field(default="us-east-1", description="AWS S3 region")
endpoint_url: Optional[str] = Field(
default=None, description="Custom S3 endpoint URL"
)
path_style: bool = Field(default=False, description="Use path-style addressing")
enabled: bool = Field(default=True, description="Enable S3 backups")
class BackupSettings(BaseModel):
"""Backup settings model for API."""
s3: Optional[S3Config] = None
class BackupTest(BaseModel):
"""Backup connection test request model."""
service: str = Field(..., description="Backup service type (s3)")
config: S3Config = Field(..., description="S3 configuration to test")
class BackupInfo(BaseModel):
"""Backup file information model."""
key: str = Field(..., description="S3 object key")
last_modified: str = Field(..., description="Last modified timestamp (ISO format)")
size: int = Field(..., description="File size in bytes")
class BackupOperation(BaseModel):
"""Backup operation request model."""
operation: str = Field(..., description="Operation type (backup, restore)")
backup_key: Optional[str] = Field(
default=None, description="Backup key for restore operations"
)

leggen/api/routes/backup.py (new file, 264 lines)

@@ -0,0 +1,264 @@
"""API routes for backup management."""
from fastapi import APIRouter, HTTPException
from loguru import logger
from leggen.api.models.backup import (
BackupOperation,
BackupSettings,
BackupTest,
S3Config,
)
from leggen.api.models.common import APIResponse
from leggen.models.config import S3BackupConfig
from leggen.services.backup_service import BackupService
from leggen.utils.config import config
from leggen.utils.paths import path_manager
router = APIRouter()
@router.get("/backup/settings", response_model=APIResponse)
async def get_backup_settings() -> APIResponse:
"""Get current backup settings."""
try:
backup_config = config.backup_config
# Build response safely without exposing secrets
s3_config = backup_config.get("s3", {})
settings = BackupSettings(
s3=S3Config(
access_key_id="***" if s3_config.get("access_key_id") else "",
secret_access_key="***" if s3_config.get("secret_access_key") else "",
bucket_name=s3_config.get("bucket_name", ""),
region=s3_config.get("region", "us-east-1"),
endpoint_url=s3_config.get("endpoint_url"),
path_style=s3_config.get("path_style", False),
enabled=s3_config.get("enabled", True),
)
if s3_config.get("bucket_name")
else None,
)
return APIResponse(
success=True,
data=settings,
message="Backup settings retrieved successfully",
)
except Exception as e:
logger.error(f"Failed to get backup settings: {e}")
raise HTTPException(
status_code=500, detail=f"Failed to get backup settings: {str(e)}"
) from e
@router.put("/backup/settings", response_model=APIResponse)
async def update_backup_settings(settings: BackupSettings) -> APIResponse:
"""Update backup settings."""
try:
# First test the connection if S3 config is provided
if settings.s3:
# Convert API model to config model
s3_config = S3BackupConfig(
access_key_id=settings.s3.access_key_id,
secret_access_key=settings.s3.secret_access_key,
bucket_name=settings.s3.bucket_name,
region=settings.s3.region,
endpoint_url=settings.s3.endpoint_url,
path_style=settings.s3.path_style,
enabled=settings.s3.enabled,
)
# Test connection
backup_service = BackupService()
connection_success = await backup_service.test_connection(s3_config)
if not connection_success:
raise HTTPException(
status_code=400,
detail="S3 connection test failed. Please check your configuration.",
)
# Update backup config
backup_config = {}
if settings.s3:
backup_config["s3"] = {
"access_key_id": settings.s3.access_key_id,
"secret_access_key": settings.s3.secret_access_key,
"bucket_name": settings.s3.bucket_name,
"region": settings.s3.region,
"endpoint_url": settings.s3.endpoint_url,
"path_style": settings.s3.path_style,
"enabled": settings.s3.enabled,
}
# Save to config
if backup_config:
config.update_section("backup", backup_config)
return APIResponse(
success=True,
data={"updated": True},
message="Backup settings updated successfully",
)
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to update backup settings: {e}")
raise HTTPException(
status_code=500, detail=f"Failed to update backup settings: {str(e)}"
) from e
@router.post("/backup/test", response_model=APIResponse)
async def test_backup_connection(test_request: BackupTest) -> APIResponse:
"""Test backup connection."""
try:
if test_request.service != "s3":
raise HTTPException(
status_code=400, detail="Only 's3' service is supported"
)
# Convert API model to config model
s3_config = S3BackupConfig(
access_key_id=test_request.config.access_key_id,
secret_access_key=test_request.config.secret_access_key,
bucket_name=test_request.config.bucket_name,
region=test_request.config.region,
endpoint_url=test_request.config.endpoint_url,
path_style=test_request.config.path_style,
enabled=test_request.config.enabled,
)
backup_service = BackupService()
success = await backup_service.test_connection(s3_config)
if success:
return APIResponse(
success=True,
data={"connected": True},
message="S3 connection test successful",
)
else:
return APIResponse(
success=False,
message="S3 connection test failed",
)
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to test backup connection: {e}")
raise HTTPException(
status_code=500, detail=f"Failed to test backup connection: {str(e)}"
) from e
@router.get("/backup/list", response_model=APIResponse)
async def list_backups() -> APIResponse:
"""List available backups."""
try:
backup_config = config.backup_config.get("s3", {})
if not backup_config.get("bucket_name"):
return APIResponse(
success=True,
data=[],
message="No S3 backup configuration found",
)
# Convert config to model
s3_config = S3BackupConfig(**backup_config)
backup_service = BackupService(s3_config)
backups = await backup_service.list_backups()
return APIResponse(
success=True,
data=backups,
message=f"Found {len(backups)} backups",
)
except Exception as e:
logger.error(f"Failed to list backups: {e}")
raise HTTPException(
status_code=500, detail=f"Failed to list backups: {str(e)}"
) from e
@router.post("/backup/operation", response_model=APIResponse)
async def backup_operation(operation_request: BackupOperation) -> APIResponse:
"""Perform backup operation (backup or restore)."""
try:
backup_config = config.backup_config.get("s3", {})
if not backup_config.get("bucket_name"):
raise HTTPException(status_code=400, detail="S3 backup is not configured")
# Convert config to model with validation
try:
s3_config = S3BackupConfig(**backup_config)
except Exception as e:
raise HTTPException(
status_code=400, detail=f"Invalid S3 configuration: {str(e)}"
) from e
backup_service = BackupService(s3_config)
if operation_request.operation == "backup":
# Backup database
database_path = path_manager.get_database_path()
success = await backup_service.backup_database(database_path)
if success:
return APIResponse(
success=True,
data={"operation": "backup", "completed": True},
message="Database backup completed successfully",
)
else:
return APIResponse(
success=False,
message="Database backup failed",
)
elif operation_request.operation == "restore":
if not operation_request.backup_key:
raise HTTPException(
status_code=400,
detail="backup_key is required for restore operation",
)
# Restore database
database_path = path_manager.get_database_path()
success = await backup_service.restore_database(
operation_request.backup_key, database_path
)
if success:
return APIResponse(
success=True,
data={"operation": "restore", "completed": True},
message="Database restore completed successfully",
)
else:
return APIResponse(
success=False,
message="Database restore failed",
)
else:
raise HTTPException(
status_code=400, detail="Invalid operation. Use 'backup' or 'restore'"
)
except HTTPException:
raise
except Exception as e:
logger.error(f"Failed to perform backup operation: {e}")
raise HTTPException(
status_code=500, detail=f"Failed to perform backup operation: {str(e)}"
) from e
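
These routes are registered under the /api/v1 prefix later in this diff, so they can be exercised over plain HTTP; a hedged sketch with requests (host and port are assumptions, taken from the CLI's http://localhost:8000 default):

import requests

BASE = "http://localhost:8000/api/v1"  # host and port assumed

# Read current settings (secrets come back masked as "***")
settings = requests.get(f"{BASE}/backup/settings").json()["data"]

# Trigger an on-demand backup, then list what landed in the bucket
requests.post(f"{BASE}/backup/operation", json={"operation": "backup"}).raise_for_status()
backups = requests.get(f"{BASE}/backup/list").json()["data"]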


@@ -2,7 +2,6 @@ import click
from leggen.api_client import LeggenAPIClient
from leggen.main import cli
from leggen.utils.disk import save_file
from leggen.utils.text import info, print_table, success, warning
@@ -63,9 +62,6 @@ def add(ctx):
# Connect to bank via API
result = api_client.connect_to_bank(bank_id, "http://localhost:8000/")
# Save requisition details
save_file(f"req_{result['id']}.json", result)
success("Bank connection request created successfully!")
warning(
"Please open the following URL in your browser to complete the authorization:"


@@ -7,7 +7,7 @@ from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from loguru import logger
from leggen.api.routes import accounts, banks, notifications, sync, transactions
from leggen.api.routes import accounts, backup, banks, notifications, sync, transactions
from leggen.background.scheduler import scheduler
from leggen.utils.config import config
from leggen.utils.paths import path_manager
@@ -81,6 +81,7 @@ def create_app() -> FastAPI:
app.include_router(transactions.router, prefix="/api/v1", tags=["transactions"])
app.include_router(sync.router, prefix="/api/v1", tags=["sync"])
app.include_router(notifications.router, prefix="/api/v1", tags=["notifications"])
app.include_router(backup.router, prefix="/api/v1", tags=["backup"])
@app.get("/api/v1/health")
async def health():

leggen/models/config.py

@@ -32,6 +32,22 @@ class NotificationConfig(BaseModel):
telegram: Optional[TelegramNotificationConfig] = None
class S3BackupConfig(BaseModel):
access_key_id: str = Field(..., description="AWS S3 access key ID")
secret_access_key: str = Field(..., description="AWS S3 secret access key")
bucket_name: str = Field(..., description="S3 bucket name")
region: str = Field(default="us-east-1", description="AWS S3 region")
endpoint_url: Optional[str] = Field(
default=None, description="Custom S3 endpoint URL"
)
path_style: bool = Field(default=False, description="Use path-style addressing")
enabled: bool = Field(default=True, description="Enable S3 backups")
class BackupConfig(BaseModel):
s3: Optional[S3BackupConfig] = None
class FilterConfig(BaseModel):
case_insensitive: Optional[List[str]] = Field(default_factory=list)
case_sensitive: Optional[List[str]] = Field(default_factory=list)
@@ -56,3 +72,4 @@ class Config(BaseModel):
notifications: Optional[NotificationConfig] = None
filters: Optional[FilterConfig] = None
scheduler: SchedulerConfig = Field(default_factory=SchedulerConfig)
backup: Optional[BackupConfig] = None

leggen/models/database.py (new file, 93 lines)

@@ -0,0 +1,93 @@
"""SQLModel database models for Leggen."""
from datetime import datetime
from typing import Optional
from sqlmodel import JSON, Column, Field, SQLModel
class Account(SQLModel, table=True):
"""Account model."""
__tablename__ = "accounts"
id: str = Field(primary_key=True)
institution_id: str = Field(index=True)
status: str = Field(index=True)
iban: Optional[str] = None
name: Optional[str] = None
currency: Optional[str] = None
created: datetime
last_accessed: Optional[datetime] = None
last_updated: Optional[datetime] = None
display_name: Optional[str] = None
logo: Optional[str] = None
class Balance(SQLModel, table=True):
"""Balance model."""
__tablename__ = "balances"
id: Optional[int] = Field(default=None, primary_key=True)
account_id: str = Field(index=True)
bank: str
status: str
iban: str
amount: float
currency: str
type: str
timestamp: datetime = Field(index=True)
class Transaction(SQLModel, table=True):
"""Transaction model."""
__tablename__ = "transactions"
accountId: str = Field(primary_key=True)
transactionId: str = Field(primary_key=True)
internalTransactionId: Optional[str] = Field(default=None, index=True)
institutionId: str
iban: Optional[str] = None
transactionDate: Optional[datetime] = Field(default=None, index=True)
description: Optional[str] = None
transactionValue: Optional[float] = Field(default=None, index=True)
transactionCurrency: Optional[str] = None
transactionStatus: Optional[str] = None
rawTransaction: dict = Field(sa_column=Column(JSON))
class TransactionEnrichment(SQLModel, table=True):
"""Transaction enrichment model."""
__tablename__ = "transaction_enrichments"
accountId: str = Field(primary_key=True, foreign_key="transactions.accountId")
transactionId: str = Field(
primary_key=True, foreign_key="transactions.transactionId"
)
clean_name: Optional[str] = Field(default=None, index=True)
category: Optional[str] = Field(default=None, index=True)
logo_url: Optional[str] = None
created_at: datetime
updated_at: datetime
class SyncOperation(SQLModel, table=True):
"""Sync operation model."""
__tablename__ = "sync_operations"
id: Optional[int] = Field(default=None, primary_key=True)
started_at: datetime = Field(index=True)
completed_at: Optional[datetime] = None
success: Optional[bool] = Field(default=None, index=True)
accounts_processed: int = Field(default=0)
transactions_added: int = Field(default=0)
transactions_updated: int = Field(default=0)
balances_updated: int = Field(default=0)
duration_seconds: Optional[float] = None
errors: Optional[str] = None
logs: Optional[str] = None
trigger_type: str = Field(default="manual", index=True)
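
A minimal sketch of writing through the composite (accountId, transactionId) primary key with these models (the engine URL is assumed; session.merge gives insert-or-update semantics keyed on the primary key):

from sqlmodel import Session, create_engine

from leggen.models.database import Transaction

engine = create_engine("sqlite:///leggen.db")  # database path assumed

with Session(engine) as session:
    txn = Transaction(
        accountId="acc-1",
        transactionId="tx-1",
        institutionId="SANDBOX_BANK",
        rawTransaction={"transactionId": "tx-1", "amount": "12.34"},
    )
    session.merge(txn)  # upsert on the composite primary key
    session.commit()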

leggen/services/backup_service.py (new file, 192 lines)

@@ -0,0 +1,192 @@
"""Backup service for S3 storage."""
import tempfile
from datetime import datetime
from pathlib import Path
from typing import Optional
import boto3
from botocore.exceptions import ClientError, NoCredentialsError
from loguru import logger
from leggen.models.config import S3BackupConfig
class BackupService:
"""Service for managing S3 backups."""
def __init__(self, s3_config: Optional[S3BackupConfig] = None):
"""Initialize backup service with S3 configuration."""
self.s3_config = s3_config
self._s3_client = None
def _get_s3_client(self, config: Optional[S3BackupConfig] = None):
"""Get or create S3 client with current configuration."""
current_config = config or self.s3_config
if not current_config:
raise ValueError("S3 configuration is required")
# Create S3 client with configuration
session = boto3.Session(
aws_access_key_id=current_config.access_key_id,
aws_secret_access_key=current_config.secret_access_key,
region_name=current_config.region,
)
s3_kwargs = {}
if current_config.endpoint_url:
s3_kwargs["endpoint_url"] = current_config.endpoint_url
if current_config.path_style:
from botocore.config import Config
s3_kwargs["config"] = Config(s3={"addressing_style": "path"})
return session.client("s3", **s3_kwargs)
async def test_connection(self, config: S3BackupConfig) -> bool:
"""Test S3 connection with provided configuration.
Args:
config: S3 configuration to test
Returns:
True if connection successful, False otherwise
"""
try:
s3_client = self._get_s3_client(config)
# Try to list objects in the bucket (limited to 1 to minimize cost)
s3_client.list_objects_v2(Bucket=config.bucket_name, MaxKeys=1)
logger.info(
f"S3 connection test successful for bucket: {config.bucket_name}"
)
return True
except NoCredentialsError:
logger.error("S3 credentials not found or invalid")
return False
except ClientError as e:
error_code = e.response["Error"]["Code"]
logger.error(
f"S3 connection test failed: {error_code} - {e.response['Error']['Message']}"
)
return False
except Exception as e:
logger.error(f"Unexpected error during S3 connection test: {str(e)}")
return False
async def backup_database(self, database_path: Path) -> bool:
"""Backup database file to S3.
Args:
database_path: Path to the SQLite database file
Returns:
True if backup successful, False otherwise
"""
if not self.s3_config or not self.s3_config.enabled:
logger.warning("S3 backup is not configured or disabled")
return False
if not database_path.exists():
logger.error(f"Database file not found: {database_path}")
return False
try:
s3_client = self._get_s3_client()
# Generate backup filename with timestamp
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
backup_key = f"leggen_backups/database_backup_{timestamp}.db"
# Upload database file
logger.info(f"Starting database backup to S3: {backup_key}")
s3_client.upload_file(
str(database_path), self.s3_config.bucket_name, backup_key
)
logger.info(f"Database backup completed successfully: {backup_key}")
return True
except Exception as e:
logger.error(f"Database backup failed: {str(e)}")
return False
async def list_backups(self) -> list[dict]:
"""List available backups in S3.
Returns:
List of backup metadata dictionaries
"""
if not self.s3_config or not self.s3_config.enabled:
logger.warning("S3 backup is not configured or disabled")
return []
try:
s3_client = self._get_s3_client()
# List objects with backup prefix
response = s3_client.list_objects_v2(
Bucket=self.s3_config.bucket_name, Prefix="leggen_backups/"
)
backups = []
for obj in response.get("Contents", []):
backups.append(
{
"key": obj["Key"],
"last_modified": obj["LastModified"].isoformat(),
"size": obj["Size"],
}
)
# Sort by last modified (newest first)
backups.sort(key=lambda x: x["last_modified"], reverse=True)
return backups
except Exception as e:
logger.error(f"Failed to list backups: {str(e)}")
return []
async def restore_database(self, backup_key: str, restore_path: Path) -> bool:
"""Restore database from S3 backup.
Args:
backup_key: S3 key of the backup to restore
restore_path: Path where to restore the database
Returns:
True if restore successful, False otherwise
"""
if not self.s3_config or not self.s3_config.enabled:
logger.warning("S3 backup is not configured or disabled")
return False
try:
s3_client = self._get_s3_client()
# Download backup file
logger.info(f"Starting database restore from S3: {backup_key}")
# Create parent directory if it doesn't exist
restore_path.parent.mkdir(parents=True, exist_ok=True)
# Download to temporary file first, then move to final location
with tempfile.NamedTemporaryFile(delete=False) as temp_file:
s3_client.download_file(
self.s3_config.bucket_name, backup_key, temp_file.name
)
# Move temp file to final location
temp_path = Path(temp_file.name)
temp_path.replace(restore_path)
logger.info(f"Database restore completed successfully: {restore_path}")
return True
except Exception as e:
logger.error(f"Database restore failed: {str(e)}")
return False
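A minimal end-to-end sketch of the service (credentials, bucket, and database path are placeholders; the S3BackupConfig field names match the tests further down):

```python
# Usage sketch; all values are placeholders.
import asyncio
from pathlib import Path

from leggen.models.config import S3BackupConfig
from leggen.services.backup_service import BackupService

async def main() -> None:
    config = S3BackupConfig(
        access_key_id="AKIA...",
        secret_access_key="...",
        bucket_name="my-leggen-backups",
        region="us-east-1",
        enabled=True,
    )
    service = BackupService(config)
    if await service.test_connection(config):
        await service.backup_database(Path("/data/leggen.db"))
        for backup in await service.list_backups():
            print(backup["key"], backup["size"])

asyncio.run(main())
```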

View File

@@ -0,0 +1,65 @@
"""Database connection and session management using SQLModel."""
from contextlib import contextmanager
from typing import Generator
from loguru import logger
from sqlalchemy import create_engine
from sqlalchemy.pool import StaticPool
from sqlmodel import Session, SQLModel
from leggen.utils.paths import path_manager
_engine = None
def get_database_url() -> str:
"""Get the database URL for SQLAlchemy."""
db_path = path_manager.get_database_path()
return f"sqlite:///{db_path}"
def get_engine():
"""Get or create the database engine."""
global _engine
if _engine is None:
db_url = get_database_url()
_engine = create_engine(
db_url,
connect_args={"check_same_thread": False},
poolclass=StaticPool,
echo=False,
)
return _engine
def create_db_and_tables():
"""Create all database tables."""
engine = get_engine()
SQLModel.metadata.create_all(engine)
logger.info("Database tables created/verified")
@contextmanager
def get_session() -> Generator[Session, None, None]:
"""Get a database session context manager.
Usage:
with get_session() as session:
result = session.exec(select(Account)).all()
"""
session = Session(get_engine())
try:
yield session
session.commit()
except Exception as e:
session.rollback()
logger.error(f"Database session error: {e}")
raise
finally:
session.close()
def init_database():
"""Initialize the database with tables."""
create_db_and_tables()
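The StaticPool plus check_same_thread=False combination shares one SQLite connection across threads, and the engine is cached in a module-level global. That means tests that redirect path_manager to a temporary file must also drop the cached engine, as the fixture later in this diff does; a condensed sketch of that reset:

```python
# Condensed sketch of the engine-reset pattern the test fixture below relies on.
import os
import tempfile
from pathlib import Path

import leggen.services.database as db_module
from leggen.services.database import init_database
from leggen.utils.paths import path_manager

fd, temp_path = tempfile.mkstemp(suffix=".db")
os.close(fd)
path_manager._database_path = Path(temp_path)  # point the app at the temp DB
db_module._engine = None  # drop the cached engine so get_engine() rebuilds it
init_database()  # recreate tables at the new path
```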

View File

@@ -0,0 +1,64 @@
"""Database helper utilities for Leggen - Compatibility layer."""
import sqlite3
from contextlib import contextmanager
from pathlib import Path
from typing import Any, Generator
from loguru import logger
@contextmanager
def get_db_connection(db_path: Path) -> Generator[sqlite3.Connection, None, None]:
"""Context manager for database connections.
Usage:
with get_db_connection(db_path) as conn:
cursor = conn.cursor()
cursor.execute(...)
conn.commit()
"""
conn = None
try:
conn = sqlite3.connect(str(db_path))
conn.row_factory = sqlite3.Row # Enable dict-like access
yield conn
except Exception as e:
if conn:
conn.rollback()
logger.error(f"Database error: {e}")
raise
finally:
if conn:
conn.close()
def execute_query(
db_path: Path, query: str, params: tuple = ()
) -> list[dict[str, Any]]:
"""Execute a SELECT query and return results as list of dicts."""
with get_db_connection(db_path) as conn:
cursor = conn.cursor()
cursor.execute(query, params)
rows = cursor.fetchall()
return [dict(row) for row in rows]
def execute_single(
db_path: Path, query: str, params: tuple = ()
) -> dict[str, Any] | None:
"""Execute a SELECT query and return a single result as dict or None."""
with get_db_connection(db_path) as conn:
cursor = conn.cursor()
cursor.execute(query, params)
row = cursor.fetchone()
return dict(row) if row else None
def execute_count(db_path: Path, query: str, params: tuple = ()) -> int:
"""Execute a COUNT query and return the integer result."""
with get_db_connection(db_path) as conn:
cursor = conn.cursor()
cursor.execute(query, params)
result = cursor.fetchone()
return result[0] if result else 0
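A minimal sketch of the compatibility helpers (the module path, database path, and queries are illustrative assumptions):

```python
# Sketch only; module path, file path, and SQL are placeholders.
from pathlib import Path

from leggen.utils.database import (  # assumed module path
    execute_count,
    execute_query,
    execute_single,
)

db = Path("/data/leggen.db")
eur_rows = execute_query(
    db, "SELECT account_id, amount FROM balances WHERE currency = ?", ("EUR",)
)
latest = execute_single(db, "SELECT * FROM balances ORDER BY timestamp DESC LIMIT 1")
total = execute_count(db, "SELECT COUNT(*) FROM transactions")
print(len(eur_rows), latest, total)
```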

File diff suppressed because it is too large

View File

@@ -9,16 +9,21 @@ from leggen.utils.config import config
from leggen.utils.paths import path_manager
def _log_rate_limits(response):
def _log_rate_limits(response, method, url):
"""Log GoCardless API rate limit headers"""
limit = response.headers.get("http_x_ratelimit_limit")
remaining = response.headers.get("http_x_ratelimit_remaining")
reset = response.headers.get("http_x_ratelimit_reset")
if limit or remaining or reset:
logger.info(
f"GoCardless rate limits - Limit: {limit}, Remaining: {remaining}, Reset: {reset}s"
)
account_limit = response.headers.get("http_x_ratelimit_account_success_limit")
account_remaining = response.headers.get(
"http_x_ratelimit_account_success_remaining"
)
account_reset = response.headers.get("http_x_ratelimit_account_success_reset")
logger.debug(
f"{method} {url} Limit/Remaining/Reset (Global: {limit}/{remaining}/{reset}s) (Account: {account_limit}/{account_remaining}/{account_reset}s)"
)
class GoCardlessService:
@@ -37,7 +42,7 @@ class GoCardlessService:
async with httpx.AsyncClient() as client:
response = await client.request(method, url, headers=headers, **kwargs)
_log_rate_limits(response)
_log_rate_limits(response, method, url)
# If we get 401, clear token cache and retry once
if response.status_code == 401:
@@ -45,7 +50,7 @@ class GoCardlessService:
self._token = None
headers = await self._get_auth_headers()
response = await client.request(method, url, headers=headers, **kwargs)
_log_rate_limits(response)
_log_rate_limits(response, method, url)
response.raise_for_status()
return response.json()
@@ -76,7 +81,9 @@ class GoCardlessService:
f"{self.base_url}/token/refresh/",
json={"refresh": auth["refresh"]},
)
_log_rate_limits(response)
_log_rate_limits(
response, "POST", f"{self.base_url}/token/refresh/"
)
response.raise_for_status()
auth.update(response.json())
self._save_auth(auth)
@@ -104,7 +111,7 @@ class GoCardlessService:
"secret_key": self.config["secret"],
},
)
_log_rate_limits(response)
_log_rate_limits(response, "POST", f"{self.base_url}/token/new/")
response.raise_for_status()
auth = response.json()
self._save_auth(auth)
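Since every call site now threads method and url through by hand, a hypothetical alternative (not what this diff does) would be an httpx response event hook, which already carries the request on the response object:

```python
# Hypothetical alternative sketch, not the approach taken in the diff:
# register the logger as an httpx event hook instead of calling it per request.
import httpx

async def _rate_limit_hook(response: httpx.Response) -> None:
    req = response.request
    _log_rate_limits(response, req.method, str(req.url))  # module-level helper above

client = httpx.AsyncClient(event_hooks={"response": [_rate_limit_hook]})
```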

View File

@@ -162,6 +162,11 @@ class Config:
}
return self.config.get("scheduler", default_schedule)
@property
def backup_config(self) -> Dict[str, Any]:
"""Get backup configuration"""
return self.config.get("backup", {})
def load_config(ctx: click.Context, _, filename):
try:
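For reference, the parsed backup section this property returns would be shaped roughly like this (field names taken from the backup tests below; values are placeholders):

```python
# Illustrative shape of config.get("backup", {}); values are placeholders.
backup_config = {
    "s3": {
        "access_key_id": "AKIA...",
        "secret_access_key": "...",
        "bucket_name": "my-leggen-backups",
        "region": "us-east-1",
        "endpoint_url": None,   # set for S3-compatible services like MinIO
        "path_style": False,    # True forces path-style addressing
        "enabled": True,
    }
}
```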

View File

@@ -1,35 +0,0 @@
import json
import sys
from pathlib import Path
import click
from leggen.utils.text import error, info
def save_file(name: str, d: dict):
Path.mkdir(Path(click.get_app_dir("leggen")), exist_ok=True)
config_file = click.get_app_dir("leggen") / Path(name)
with click.open_file(str(config_file), "w") as f:
json.dump(d, f)
info(f"Wrote configuration file at '{config_file}'")
def load_file(name: str) -> dict:
config_file = click.get_app_dir("leggen") / Path(name)
try:
with click.open_file(str(config_file), "r") as f:
config = json.load(f)
return config
except FileNotFoundError:
error(f"Configuration file '{config_file}' not found")
sys.exit(1)
def get_prefixed_files(prefix: str) -> list:
return [
f.name
for f in Path(click.get_app_dir("leggen")).iterdir()
if f.name.startswith(prefix)
]

View File

@@ -1,6 +1,6 @@
[project]
name = "leggen"
version = "2025.9.24"
version = "2025.9.26"
description = "An Open Banking CLI"
authors = [{ name = "Elisiário Couto", email = "elisiario@couto.io" }]
requires-python = "~=3.13.0"
@@ -35,6 +35,9 @@ dependencies = [
"tomli-w>=1.0.0,<2",
"httpx>=0.28.1",
"pydantic>=2.0.0,<3",
"boto3>=1.35.0,<2",
"sqlmodel>=0.0.25",
"alembic>=1.16.5",
]
[project.urls]
@@ -88,5 +91,5 @@ markers = [
]
[[tool.mypy.overrides]]
module = ["apscheduler.*", "discord_webhook.*"]
module = ["apscheduler.*", "discord_webhook.*", "botocore.*", "boto3.*"]
ignore_missing_imports = true

View File

@@ -42,6 +42,9 @@ echo " > Version bumped to $NEXT_VERSION"
echo "Updating CHANGELOG.md"
git-cliff --unreleased --tag "$NEXT_VERSION" --prepend CHANGELOG.md > /dev/null
echo "Locking dependencies"
uv lock
echo " > Committing changes and adding git tag"
git add pyproject.toml CHANGELOG.md uv.lock
git commit -m "chore(ci): Bump version to $NEXT_VERSION"

View File

@@ -0,0 +1,303 @@
"""Tests for backup API endpoints."""
from unittest.mock import patch
import pytest
@pytest.mark.api
class TestBackupAPI:
"""Test backup-related API endpoints."""
def test_get_backup_settings_no_config(self, api_client, mock_config):
"""Test getting backup settings with no configuration."""
# Mock empty backup config by updating the config dict
mock_config._config["backup"] = {}
with patch("leggen.utils.config.config", mock_config):
response = api_client.get("/api/v1/backup/settings")
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert data["data"]["s3"] is None
def test_get_backup_settings_with_s3_config(self, api_client, mock_config):
"""Test getting backup settings with S3 configuration."""
# Mock S3 backup config (with masked credentials)
mock_config._config["backup"] = {
"s3": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"endpoint_url": None,
"path_style": False,
"enabled": True,
}
}
with patch("leggen.utils.config.config", mock_config):
response = api_client.get("/api/v1/backup/settings")
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert data["data"]["s3"] is not None
s3_config = data["data"]["s3"]
assert s3_config["access_key_id"] == "***" # Masked
assert s3_config["secret_access_key"] == "***" # Masked
assert s3_config["bucket_name"] == "test-bucket"
assert s3_config["region"] == "us-east-1"
assert s3_config["enabled"] is True
@patch("leggen.services.backup_service.BackupService.test_connection")
def test_update_backup_settings_success(
self, mock_test_connection, api_client, mock_config
):
"""Test successful backup settings update."""
mock_test_connection.return_value = True
mock_config._config["backup"] = {}
request_data = {
"s3": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"endpoint_url": None,
"path_style": False,
"enabled": True,
}
}
with patch("leggen.utils.config.config", mock_config):
response = api_client.put("/api/v1/backup/settings", json=request_data)
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert data["data"]["updated"] is True
# Verify connection test was called
mock_test_connection.assert_called_once()
@patch("leggen.services.backup_service.BackupService.test_connection")
def test_update_backup_settings_connection_failure(
self, mock_test_connection, api_client, mock_config
):
"""Test backup settings update with connection test failure."""
mock_test_connection.return_value = False
mock_config._config["backup"] = {}
request_data = {
"s3": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "invalid-bucket",
"region": "us-east-1",
"endpoint_url": None,
"path_style": False,
"enabled": True,
}
}
with patch("leggen.utils.config.config", mock_config):
response = api_client.put("/api/v1/backup/settings", json=request_data)
assert response.status_code == 400
data = response.json()
assert "S3 connection test failed" in data["detail"]
@patch("leggen.services.backup_service.BackupService.test_connection")
def test_test_backup_connection_success(self, mock_test_connection, api_client):
"""Test successful backup connection test."""
mock_test_connection.return_value = True
request_data = {
"service": "s3",
"config": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"endpoint_url": None,
"path_style": False,
"enabled": True,
},
}
response = api_client.post("/api/v1/backup/test", json=request_data)
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert data["data"]["connected"] is True
# Verify connection test was called
mock_test_connection.assert_called_once()
@patch("leggen.services.backup_service.BackupService.test_connection")
def test_test_backup_connection_failure(self, mock_test_connection, api_client):
"""Test failed backup connection test."""
mock_test_connection.return_value = False
request_data = {
"service": "s3",
"config": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "invalid-bucket",
"region": "us-east-1",
"endpoint_url": None,
"path_style": False,
"enabled": True,
},
}
response = api_client.post("/api/v1/backup/test", json=request_data)
assert response.status_code == 200
data = response.json()
assert data["success"] is False
def test_test_backup_connection_invalid_service(self, api_client):
"""Test backup connection test with invalid service."""
request_data = {
"service": "invalid",
"config": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"endpoint_url": None,
"path_style": False,
"enabled": True,
},
}
response = api_client.post("/api/v1/backup/test", json=request_data)
assert response.status_code == 400
data = response.json()
assert "Only 's3' service is supported" in data["detail"]
@patch("leggen.services.backup_service.BackupService.list_backups")
def test_list_backups_success(self, mock_list_backups, api_client, mock_config):
"""Test successful backup listing."""
mock_list_backups.return_value = [
{
"key": "leggen_backups/database_backup_20250101_120000.db",
"last_modified": "2025-01-01T12:00:00",
"size": 1024,
},
{
"key": "leggen_backups/database_backup_20250101_110000.db",
"last_modified": "2025-01-01T11:00:00",
"size": 512,
},
]
mock_config._config["backup"] = {
"s3": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"enabled": True,
}
}
with patch("leggen.utils.config.config", mock_config):
response = api_client.get("/api/v1/backup/list")
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert len(data["data"]) == 2
assert (
data["data"][0]["key"]
== "leggen_backups/database_backup_20250101_120000.db"
)
def test_list_backups_no_config(self, api_client, mock_config):
"""Test backup listing with no configuration."""
mock_config._config["backup"] = {}
with patch("leggen.utils.config.config", mock_config):
response = api_client.get("/api/v1/backup/list")
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert data["data"] == []
@patch("leggen.services.backup_service.BackupService.backup_database")
@patch("leggen.utils.paths.path_manager.get_database_path")
def test_backup_operation_success(
self, mock_get_db_path, mock_backup_db, api_client, mock_config
):
"""Test successful backup operation."""
from pathlib import Path
mock_get_db_path.return_value = Path("/test/database.db")
mock_backup_db.return_value = True
mock_config._config["backup"] = {
"s3": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"enabled": True,
}
}
request_data = {"operation": "backup"}
with patch("leggen.utils.config.config", mock_config):
response = api_client.post("/api/v1/backup/operation", json=request_data)
assert response.status_code == 200
data = response.json()
assert data["success"] is True
assert data["data"]["operation"] == "backup"
assert data["data"]["completed"] is True
# Verify backup was called
mock_backup_db.assert_called_once()
def test_backup_operation_no_config(self, api_client, mock_config):
"""Test backup operation with no configuration."""
mock_config._config["backup"] = {}
request_data = {"operation": "backup"}
with patch("leggen.utils.config.config", mock_config):
response = api_client.post("/api/v1/backup/operation", json=request_data)
assert response.status_code == 400
data = response.json()
assert "S3 backup is not configured" in data["detail"]
def test_backup_operation_invalid_operation(self, api_client, mock_config):
"""Test backup operation with invalid operation type."""
mock_config._config["backup"] = {
"s3": {
"access_key_id": "AKIATEST123",
"secret_access_key": "secret123",
"bucket_name": "test-bucket",
"region": "us-east-1",
"enabled": True,
}
}
request_data = {"operation": "invalid"}
with patch("leggen.utils.config.config", mock_config):
response = api_client.post("/api/v1/backup/operation", json=request_data)
assert response.status_code == 400
data = response.json()
assert "Invalid operation" in data["detail"]
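For reference, the backup endpoints these tests exercise, driven as a client would call them (the base URL and a running server are assumptions):

```python
# Sketch of the endpoints exercised above; base URL is a placeholder.
import httpx

base = "http://localhost:8000/api/v1/backup"
with httpx.Client() as client:
    print(client.get(f"{base}/settings").json())  # masked S3 settings
    print(client.get(f"{base}/list").json())      # available backups
    resp = client.post(f"{base}/operation", json={"operation": "backup"})
    print(resp.json())                            # {"success": ..., "data": {...}}
```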

View File

@@ -0,0 +1,226 @@
"""Tests for backup service functionality."""
import tempfile
from pathlib import Path
from unittest.mock import MagicMock, patch
import pytest
from botocore.exceptions import ClientError, NoCredentialsError
from leggen.models.config import S3BackupConfig
from leggen.services.backup_service import BackupService
@pytest.mark.unit
class TestBackupService:
"""Test backup service functionality."""
def test_backup_service_initialization(self):
"""Test backup service can be initialized."""
service = BackupService()
assert service.s3_config is None
assert service._s3_client is None
def test_backup_service_with_config(self):
"""Test backup service initialization with config."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService(s3_config)
assert service.s3_config == s3_config
@pytest.mark.asyncio
async def test_test_connection_success(self):
"""Test successful S3 connection test."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService()
# Mock S3 client
with patch("boto3.Session") as mock_session:
mock_client = MagicMock()
mock_session.return_value.client.return_value = mock_client
# Mock successful list_objects_v2 call
mock_client.list_objects_v2.return_value = {"Contents": []}
result = await service.test_connection(s3_config)
assert result is True
# Verify the client was called correctly
mock_client.list_objects_v2.assert_called_once_with(
Bucket="test-bucket", MaxKeys=1
)
@pytest.mark.asyncio
async def test_test_connection_no_credentials(self):
"""Test S3 connection test with no credentials."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService()
# Mock S3 client to raise NoCredentialsError
with patch("boto3.Session") as mock_session:
mock_client = MagicMock()
mock_session.return_value.client.return_value = mock_client
mock_client.list_objects_v2.side_effect = NoCredentialsError()
result = await service.test_connection(s3_config)
assert result is False
@pytest.mark.asyncio
async def test_test_connection_client_error(self):
"""Test S3 connection test with client error."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService()
# Mock S3 client to raise ClientError
with patch("boto3.Session") as mock_session:
mock_client = MagicMock()
mock_session.return_value.client.return_value = mock_client
error_response = {
"Error": {"Code": "NoSuchBucket", "Message": "Bucket not found"}
}
mock_client.list_objects_v2.side_effect = ClientError(
error_response, "ListObjectsV2"
)
result = await service.test_connection(s3_config)
assert result is False
@pytest.mark.asyncio
async def test_backup_database_no_config(self):
"""Test backup database with no configuration."""
service = BackupService()
with tempfile.TemporaryDirectory() as tmpdir:
db_path = Path(tmpdir) / "test.db"
db_path.write_text("test database content")
result = await service.backup_database(db_path)
assert result is False
@pytest.mark.asyncio
async def test_backup_database_disabled(self):
"""Test backup database with disabled configuration."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
enabled=False,
)
service = BackupService(s3_config)
with tempfile.TemporaryDirectory() as tmpdir:
db_path = Path(tmpdir) / "test.db"
db_path.write_text("test database content")
result = await service.backup_database(db_path)
assert result is False
@pytest.mark.asyncio
async def test_backup_database_file_not_found(self):
"""Test backup database with non-existent file."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService(s3_config)
non_existent_path = Path("/non/existent/path.db")
result = await service.backup_database(non_existent_path)
assert result is False
@pytest.mark.asyncio
async def test_backup_database_success(self):
"""Test successful database backup."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService(s3_config)
with tempfile.TemporaryDirectory() as tmpdir:
db_path = Path(tmpdir) / "test.db"
db_path.write_text("test database content")
# Mock S3 client
with patch("boto3.Session") as mock_session:
mock_client = MagicMock()
mock_session.return_value.client.return_value = mock_client
result = await service.backup_database(db_path)
assert result is True
# Verify upload_file was called
mock_client.upload_file.assert_called_once()
args = mock_client.upload_file.call_args[0]
assert args[0] == str(db_path) # source file
assert args[1] == "test-bucket" # bucket name
assert args[2].startswith("leggen_backups/database_backup_") # key
@pytest.mark.asyncio
async def test_list_backups_success(self):
"""Test successful backup listing."""
s3_config = S3BackupConfig(
access_key_id="test-key",
secret_access_key="test-secret",
bucket_name="test-bucket",
region="us-east-1",
)
service = BackupService(s3_config)
# Mock S3 client response
with patch("boto3.Session") as mock_session:
mock_client = MagicMock()
mock_session.return_value.client.return_value = mock_client
from datetime import datetime
mock_response = {
"Contents": [
{
"Key": "leggen_backups/database_backup_20250101_120000.db",
"LastModified": datetime(2025, 1, 1, 12, 0, 0),
"Size": 1024,
},
{
"Key": "leggen_backups/database_backup_20250101_130000.db",
"LastModified": datetime(2025, 1, 1, 13, 0, 0),
"Size": 2048,
},
]
}
mock_client.list_objects_v2.return_value = mock_response
backups = await service.list_backups()
assert len(backups) == 2
# Check that backups are sorted by last modified (newest first)
assert backups[0]["last_modified"] > backups[1]["last_modified"]
assert backups[0]["size"] == 2048
assert backups[1]["size"] == 1024

View File

@@ -106,6 +106,11 @@ class TestConfigurablePaths:
# Set custom database path
path_manager.set_database_path(test_db_path)
# Initialize database tables for the custom path
from leggen.services.database import init_database
init_database()
# Test database operations using DatabaseService
database_service = DatabaseService()
balance_data = {

View File

@@ -1,16 +1,54 @@
"""Tests for database service."""
import tempfile
from datetime import datetime
from pathlib import Path
from unittest.mock import patch
import pytest
from leggen.services.database import init_database
from leggen.services.database_service import DatabaseService
from leggen.utils.paths import path_manager
@pytest.fixture
def database_service():
"""Create a database service instance for testing."""
def test_db_path():
"""Create a temporary test database."""
import os
# Create a writable temporary file
fd, temp_path = tempfile.mkstemp(suffix=".db")
os.close(fd) # Close the file descriptor
db_path = Path(temp_path)
# Set the test database path
original_path = path_manager._database_path
path_manager._database_path = db_path
# Reset the engine to use the new database path
import leggen.services.database as db_module
original_engine = db_module._engine
db_module._engine = None
# Initialize database tables
init_database()
yield db_path
# Cleanup - close any sessions first
if db_module._engine:
db_module._engine.dispose()
db_module._engine = original_engine
path_manager._database_path = original_path
if db_path.exists():
db_path.unlink()
@pytest.fixture
def database_service(test_db_path):
"""Create a database service instance for testing with real database."""
return DatabaseService()
@@ -282,6 +320,7 @@ class TestDatabaseService:
"""Test successful balance persistence."""
balance_data = {
"institution_id": "REVOLUT_REVOLT21",
"account_status": "active",
"iban": "LT313250081177977789",
"balances": [
{
@@ -291,26 +330,23 @@ class TestDatabaseService:
],
}
with patch("sqlite3.connect") as mock_connect:
mock_conn = mock_connect.return_value
mock_cursor = mock_conn.cursor.return_value
# Test actual persistence
await database_service._persist_balance_sqlite("test-account-123", balance_data)
await database_service._persist_balance_sqlite(
"test-account-123", balance_data
)
# Verify database operations
mock_connect.assert_called()
mock_cursor.execute.assert_called() # Table creation and insert
mock_conn.commit.assert_called_once()
mock_conn.close.assert_called_once()
# Verify balance was persisted
balances = await database_service.get_balances_from_db("test-account-123")
assert len(balances) == 1
assert balances[0]["account_id"] == "test-account-123"
assert balances[0]["amount"] == 1000.0
assert balances[0]["currency"] == "EUR"
async def test_persist_balance_sqlite_error(self, database_service):
"""Test handling error during balance persistence."""
balance_data = {"balances": []}
with patch("sqlite3.connect") as mock_connect:
mock_connect.side_effect = Exception("Database error")
# Mock get_session to raise an error
with patch("leggen.services.database_service.get_session") as mock_session:
mock_session.side_effect = Exception("Database error")
with pytest.raises(Exception, match="Database error"):
await database_service._persist_balance_sqlite(
@@ -321,52 +357,48 @@ class TestDatabaseService:
self, database_service, sample_transactions_db_format
):
"""Test successful transaction persistence."""
with patch("sqlite3.connect") as mock_connect:
mock_conn = mock_connect.return_value
mock_cursor = mock_conn.cursor.return_value
# Mock fetchone to return (0,) indicating transaction doesn't exist yet
mock_cursor.fetchone.return_value = (0,)
result = await database_service._persist_transactions_sqlite(
"test-account-123", sample_transactions_db_format
)
result = await database_service._persist_transactions_sqlite(
"test-account-123", sample_transactions_db_format
)
# Should return all transactions as new
assert len(result) == 2
# Should return the transactions (assuming no duplicates)
assert len(result) >= 0 # Could be empty if all are duplicates
# Verify database operations
mock_connect.assert_called()
mock_cursor.execute.assert_called()
mock_conn.commit.assert_called_once()
mock_conn.close.assert_called_once()
# Verify transactions were persisted
transactions = await database_service.get_transactions_from_db(
account_id="test-account-123"
)
assert len(transactions) == 2
assert transactions[0]["accountId"] == "test-account-123"
async def test_persist_transactions_sqlite_duplicate_detection(
self, database_service, sample_transactions_db_format
):
"""Test that existing transactions are not returned as new."""
with patch("sqlite3.connect") as mock_connect:
mock_conn = mock_connect.return_value
mock_cursor = mock_conn.cursor.return_value
# Mock fetchone to return (1,) indicating transaction already exists
mock_cursor.fetchone.return_value = (1,)
# First insert
result1 = await database_service._persist_transactions_sqlite(
"test-account-123", sample_transactions_db_format
)
assert len(result1) == 2
result = await database_service._persist_transactions_sqlite(
"test-account-123", sample_transactions_db_format
)
# Second insert (duplicates)
result2 = await database_service._persist_transactions_sqlite(
"test-account-123", sample_transactions_db_format
)
# Should return empty list since all transactions already exist
assert len(result) == 0
# Should return empty list since all transactions already exist
assert len(result2) == 0
# Verify database operations still happened (INSERT OR REPLACE executed)
mock_connect.assert_called()
mock_cursor.execute.assert_called()
mock_conn.commit.assert_called_once()
mock_conn.close.assert_called_once()
# Verify still only 2 transactions in database
transactions = await database_service.get_transactions_from_db(
account_id="test-account-123"
)
assert len(transactions) == 2
async def test_persist_transactions_sqlite_error(self, database_service):
"""Test handling error during transaction persistence."""
with patch("sqlite3.connect") as mock_connect:
mock_connect.side_effect = Exception("Database error")
with patch("leggen.services.database_service.get_session") as mock_session:
mock_session.side_effect = Exception("Database error")
with pytest.raises(Exception, match="Database error"):
await database_service._persist_transactions_sqlite(

uv.lock generated
View File

@@ -2,6 +2,20 @@ version = 1
revision = 3
requires-python = "==3.13.*"
[[package]]
name = "alembic"
version = "1.16.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mako" },
{ name = "sqlalchemy" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9a/ca/4dc52902cf3491892d464f5265a81e9dff094692c8a049a3ed6a05fe7ee8/alembic-1.16.5.tar.gz", hash = "sha256:a88bb7f6e513bd4301ecf4c7f2206fe93f9913f9b48dac3b78babde2d6fe765e", size = 1969868, upload-time = "2025-08-27T18:02:05.668Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/39/4a/4c61d4c84cfd9befb6fa08a702535b27b21fff08c946bc2f6139decbf7f7/alembic-1.16.5-py3-none-any.whl", hash = "sha256:e845dfe090c5ffa7b92593ae6687c5cb1a101e91fa53868497dbd79847f9dbe3", size = 247355, upload-time = "2025-08-27T18:02:07.37Z" },
]
[[package]]
name = "annotated-types"
version = "0.7.0"
@@ -36,6 +50,34 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/d0/ae/9a053dd9229c0fde6b1f1f33f609ccff1ee79ddda364c756a924c6d8563b/APScheduler-3.11.0-py3-none-any.whl", hash = "sha256:fc134ca32e50f5eadcc4938e3a4545ab19131435e851abb40b34d63d5141c6da", size = 64004, upload-time = "2024-11-24T19:39:24.442Z" },
]
[[package]]
name = "boto3"
version = "1.40.36"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "botocore" },
{ name = "jmespath" },
{ name = "s3transfer" },
]
sdist = { url = "https://files.pythonhosted.org/packages/8d/21/7bc857b155e8264c92b6fa8e0860a67dc01a19cbe6ba4342500299f2ae5b/boto3-1.40.36.tar.gz", hash = "sha256:bfc1f3d5c4f5d12b8458406b8972f8794ac57e2da1ee441469e143bc0440a5c3", size = 111552, upload-time = "2025-09-22T19:26:17.357Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/70/4c/428b728d5cf9003f83f735d10dd522945ab20c7d67e6c987909f29be12a0/boto3-1.40.36-py3-none-any.whl", hash = "sha256:d7c1fe033f491f560cd26022a9dcf28baf877ae854f33bc64fffd0df3b9c98be", size = 139345, upload-time = "2025-09-22T19:26:15.194Z" },
]
[[package]]
name = "botocore"
version = "1.40.36"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "jmespath" },
{ name = "python-dateutil" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/4b/30/75fdc75933d3bc1c8dd7fbaee771438328b518936906b411075b1eacac93/botocore-1.40.36.tar.gz", hash = "sha256:93386a8dc54173267ddfc6cd8636c9171e021f7c032aa1df3af7de816e3df616", size = 14349583, upload-time = "2025-09-22T19:26:05.957Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d8/51/95c0324ac20b5bbafad4c89dd610c8e0dd6cbadbb2c8ca66dc95ccde98b8/botocore-1.40.36-py3-none-any.whl", hash = "sha256:d6edf75875e4013cb7078875a1d6c289afb4cc6675d99d80700c692d8d8e0b72", size = 14020478, upload-time = "2025-09-22T19:26:02.054Z" },
]
[[package]]
name = "certifi"
version = "2025.8.3"
@@ -139,6 +181,23 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d", size = 15988, upload-time = "2025-08-14T16:56:01.633Z" },
]
[[package]]
name = "greenlet"
version = "3.2.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/03/b8/704d753a5a45507a7aab61f18db9509302ed3d0a27ac7e0359ec2905b1a6/greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d", size = 188260, upload-time = "2025-08-07T13:24:33.51Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/49/e8/58c7f85958bda41dafea50497cbd59738c5c43dbbea5ee83d651234398f4/greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31", size = 272814, upload-time = "2025-08-07T13:15:50.011Z" },
{ url = "https://files.pythonhosted.org/packages/62/dd/b9f59862e9e257a16e4e610480cfffd29e3fae018a68c2332090b53aac3d/greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945", size = 641073, upload-time = "2025-08-07T13:42:57.23Z" },
{ url = "https://files.pythonhosted.org/packages/f7/0b/bc13f787394920b23073ca3b6c4a7a21396301ed75a655bcb47196b50e6e/greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc", size = 655191, upload-time = "2025-08-07T13:45:29.752Z" },
{ url = "https://files.pythonhosted.org/packages/f2/d6/6adde57d1345a8d0f14d31e4ab9c23cfe8e2cd39c3baf7674b4b0338d266/greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a", size = 649516, upload-time = "2025-08-07T13:53:16.314Z" },
{ url = "https://files.pythonhosted.org/packages/7f/3b/3a3328a788d4a473889a2d403199932be55b1b0060f4ddd96ee7cdfcad10/greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504", size = 652169, upload-time = "2025-08-07T13:18:32.861Z" },
{ url = "https://files.pythonhosted.org/packages/ee/43/3cecdc0349359e1a527cbf2e3e28e5f8f06d3343aaf82ca13437a9aa290f/greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671", size = 610497, upload-time = "2025-08-07T13:18:31.636Z" },
{ url = "https://files.pythonhosted.org/packages/b8/19/06b6cf5d604e2c382a6f31cafafd6f33d5dea706f4db7bdab184bad2b21d/greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b", size = 1121662, upload-time = "2025-08-07T13:42:41.117Z" },
{ url = "https://files.pythonhosted.org/packages/a2/15/0d5e4e1a66fab130d98168fe984c509249c833c1a3c16806b90f253ce7b9/greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae", size = 1149210, upload-time = "2025-08-07T13:18:24.072Z" },
{ url = "https://files.pythonhosted.org/packages/0b/55/2321e43595e6801e105fcfdee02b34c0f996eb71e6ddffca6b10b7e1d771/greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b", size = 299685, upload-time = "2025-08-07T13:24:38.824Z" },
]
[[package]]
name = "h11"
version = "0.16.0"
@@ -218,12 +277,23 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" },
]
[[package]]
name = "jmespath"
version = "1.0.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/00/2a/e867e8531cf3e36b41201936b7fa7ba7b5702dbef42922193f05c8976cd6/jmespath-1.0.1.tar.gz", hash = "sha256:90261b206d6defd58fdd5e85f478bf633a2901798906be2ad389150c5c60edbe", size = 25843, upload-time = "2022-06-17T18:00:12.224Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/31/b4/b9b800c45527aadd64d5b442f9b932b00648617eb5d63d2c7a6587b7cafc/jmespath-1.0.1-py3-none-any.whl", hash = "sha256:02e2e4cc71b5bcab88332eebf907519190dd9e6e82107fa7f83b1003a6252980", size = 20256, upload-time = "2022-06-17T18:00:10.251Z" },
]
[[package]]
name = "leggen"
version = "2025.9.24"
version = "2025.9.26"
source = { editable = "." }
dependencies = [
{ name = "alembic" },
{ name = "apscheduler" },
{ name = "boto3" },
{ name = "click" },
{ name = "discord-webhook" },
{ name = "fastapi" },
@@ -231,6 +301,7 @@ dependencies = [
{ name = "loguru" },
{ name = "pydantic" },
{ name = "requests" },
{ name = "sqlmodel" },
{ name = "tabulate" },
{ name = "tomli-w" },
{ name = "uvicorn", extra = ["standard"] },
@@ -252,7 +323,9 @@ dev = [
[package.metadata]
requires-dist = [
{ name = "alembic", specifier = ">=1.16.5" },
{ name = "apscheduler", specifier = ">=3.10.0,<4" },
{ name = "boto3", specifier = ">=1.35.0,<2" },
{ name = "click", specifier = ">=8.1.7,<9" },
{ name = "discord-webhook", specifier = ">=1.3.1,<2" },
{ name = "fastapi", specifier = ">=0.104.0,<1" },
@@ -260,6 +333,7 @@ requires-dist = [
{ name = "loguru", specifier = ">=0.7.2,<0.8" },
{ name = "pydantic", specifier = ">=2.0.0,<3" },
{ name = "requests", specifier = ">=2.31.0,<3" },
{ name = "sqlmodel", specifier = ">=0.0.25" },
{ name = "tabulate", specifier = ">=0.9.0,<0.10" },
{ name = "tomli-w", specifier = ">=1.0.0,<2" },
{ name = "uvicorn", extras = ["standard"], specifier = ">=0.24.0,<1" },
@@ -292,6 +366,48 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/0c/29/0348de65b8cc732daa3e33e67806420b2ae89bdce2b04af740289c5c6c8c/loguru-0.7.3-py3-none-any.whl", hash = "sha256:31a33c10c8e1e10422bfd431aeb5d351c7cf7fa671e3c4df004162264b28220c", size = 61595, upload-time = "2024-12-06T11:20:54.538Z" },
]
[[package]]
name = "mako"
version = "1.3.10"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markupsafe" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" },
]
[[package]]
name = "markupsafe"
version = "3.0.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" },
{ url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" },
{ url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" },
{ url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" },
{ url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" },
{ url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" },
{ url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" },
{ url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" },
{ url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" },
{ url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" },
{ url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" },
{ url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" },
{ url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" },
{ url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" },
{ url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" },
{ url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" },
{ url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" },
{ url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" },
{ url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" },
{ url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" },
{ url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" },
{ url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" },
]
[[package]]
name = "mypy"
version = "1.17.1"
@@ -474,6 +590,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2b/b3/7fefc43fb706380144bcd293cc6e446e6f637ddfa8b83f48d1734156b529/pytest_mock-3.15.0-py3-none-any.whl", hash = "sha256:ef2219485fb1bd256b00e7ad7466ce26729b30eadfc7cbcdb4fa9a92ca68db6f", size = 10050, upload-time = "2025-09-04T20:57:47.274Z" },
]
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" },
]
[[package]]
name = "python-dotenv"
version = "1.1.1"
@@ -565,6 +693,27 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e1/a3/03216a6a86c706df54422612981fb0f9041dbb452c3401501d4a22b942c9/ruff-0.13.0-py3-none-win_arm64.whl", hash = "sha256:ab80525317b1e1d38614addec8ac954f1b3e662de9d59114ecbf771d00cf613e", size = 12312357, upload-time = "2025-09-10T16:25:35.595Z" },
]
[[package]]
name = "s3transfer"
version = "0.14.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "botocore" },
]
sdist = { url = "https://files.pythonhosted.org/packages/62/74/8d69dcb7a9efe8baa2046891735e5dfe433ad558ae23d9e3c14c633d1d58/s3transfer-0.14.0.tar.gz", hash = "sha256:eff12264e7c8b4985074ccce27a3b38a485bb7f7422cc8046fee9be4983e4125", size = 151547, upload-time = "2025-09-09T19:23:31.089Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/48/f0/ae7ca09223a81a1d890b2557186ea015f6e0502e9b8cb8e1813f1d8cfa4e/s3transfer-0.14.0-py3-none-any.whl", hash = "sha256:ea3b790c7077558ed1f02a3072fb3cb992bbbd253392f4b6e9e8976941c7d456", size = 85712, upload-time = "2025-09-09T19:23:30.041Z" },
]
[[package]]
name = "six"
version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]
[[package]]
name = "sniffio"
version = "1.3.1"
@@ -574,6 +723,40 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
]
[[package]]
name = "sqlalchemy"
version = "2.0.43"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "greenlet", marker = "platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d7/bc/d59b5d97d27229b0e009bd9098cd81af71c2fa5549c580a0a67b9bed0496/sqlalchemy-2.0.43.tar.gz", hash = "sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417", size = 9762949, upload-time = "2025-08-11T14:24:58.438Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/41/1c/a7260bd47a6fae7e03768bf66451437b36451143f36b285522b865987ced/sqlalchemy-2.0.43-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3", size = 2130598, upload-time = "2025-08-11T15:51:15.903Z" },
{ url = "https://files.pythonhosted.org/packages/8e/84/8a337454e82388283830b3586ad7847aa9c76fdd4f1df09cdd1f94591873/sqlalchemy-2.0.43-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa", size = 2118415, upload-time = "2025-08-11T15:51:17.256Z" },
{ url = "https://files.pythonhosted.org/packages/cf/ff/22ab2328148492c4d71899d62a0e65370ea66c877aea017a244a35733685/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9", size = 3248707, upload-time = "2025-08-11T15:52:38.444Z" },
{ url = "https://files.pythonhosted.org/packages/dc/29/11ae2c2b981de60187f7cbc84277d9d21f101093d1b2e945c63774477aba/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f", size = 3253602, upload-time = "2025-08-11T15:56:37.348Z" },
{ url = "https://files.pythonhosted.org/packages/b8/61/987b6c23b12c56d2be451bc70900f67dd7d989d52b1ee64f239cf19aec69/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738", size = 3183248, upload-time = "2025-08-11T15:52:39.865Z" },
{ url = "https://files.pythonhosted.org/packages/86/85/29d216002d4593c2ce1c0ec2cec46dda77bfbcd221e24caa6e85eff53d89/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164", size = 3219363, upload-time = "2025-08-11T15:56:39.11Z" },
{ url = "https://files.pythonhosted.org/packages/b6/e4/bd78b01919c524f190b4905d47e7630bf4130b9f48fd971ae1c6225b6f6a/sqlalchemy-2.0.43-cp313-cp313-win32.whl", hash = "sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d", size = 2096718, upload-time = "2025-08-11T15:55:05.349Z" },
{ url = "https://files.pythonhosted.org/packages/ac/a5/ca2f07a2a201f9497de1928f787926613db6307992fe5cda97624eb07c2f/sqlalchemy-2.0.43-cp313-cp313-win_amd64.whl", hash = "sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197", size = 2123200, upload-time = "2025-08-11T15:55:07.932Z" },
{ url = "https://files.pythonhosted.org/packages/b8/d9/13bdde6521f322861fab67473cec4b1cc8999f3871953531cf61945fad92/sqlalchemy-2.0.43-py3-none-any.whl", hash = "sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc", size = 1924759, upload-time = "2025-08-11T15:39:53.024Z" },
]
[[package]]
name = "sqlmodel"
version = "0.0.25"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
{ name = "sqlalchemy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ea/80/d9c098a88724ee4554907939cf39590cf67e10c6683723216e228d3315f7/sqlmodel-0.0.25.tar.gz", hash = "sha256:56548c2e645975b1ed94d6c53f0d13c85593f57926a575e2bf566650b2243fa4", size = 117075, upload-time = "2025-09-17T21:44:41.219Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/57/cf/5d175ce8de07fe694ec4e3d4d65c2dd06cc30f6c79599b31f9d2f6dd2830/sqlmodel-0.0.25-py3-none-any.whl", hash = "sha256:c98234cda701fb77e9dcbd81688c23bb251c13bb98ce1dd8d4adc467374d45b7", size = 28893, upload-time = "2025-09-17T21:44:39.764Z" },
]
[[package]]
name = "starlette"
version = "0.47.3"