Mirror of https://github.com/elisiariocouto/leggen.git, synced 2025-12-30 03:29:19 +00:00

Compare commits: cecde13486...2025.11.0 (12 commits)

Commits in this range:
- 5eecc72219
- b1b348badb
- d2bc179d59
- 9fee74e2a9
- 7c06a1d8b9
- d78f481192
- b32853e8fd
- 0750c41b7b
- 1cd63731a3
- 38fddeb281
- 0205e5be0d
- ca7968cc3c
@@ -10,7 +10,6 @@ repos:
       - id: trailing-whitespace
         exclude: ".*\\.md$"
       - id: end-of-file-fixer
-      - id: check-added-large-files
   - repo: local
     hooks:
       - id: mypy
@@ -41,7 +41,7 @@ The command outputs instructions for setting the required environment variable t
 uv run leggen server
 ```
 - For development mode with auto-reload: `uv run leggen server --reload`
-- API will be available at `http://localhost:8000` with docs at `http://localhost:8000/docs`
+- API will be available at `http://localhost:8000` with docs at `http://localhost:8000/api/v1/docs`

 ### Start the Frontend
 1. Navigate to the frontend directory: `cd frontend`
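The hunk above only relocates the interactive API docs from `/docs` to `/api/v1/docs`. A minimal sketch to confirm a locally running `leggen server` now serves the new path, using only the Python standard library (the port and both paths come from the docs above; nothing else is assumed):

```python
# Sketch: check that the FastAPI docs moved from /docs to /api/v1/docs.
# Assumes `leggen server` is running locally on port 8000.
from urllib.error import HTTPError
from urllib.request import urlopen

for path in ("/docs", "/api/v1/docs"):
    try:
        with urlopen(f"http://localhost:8000{path}") as resp:
            print(path, resp.status)  # the new path should return 200
    except HTTPError as err:
        print(path, err.code)  # the old path is expected to fail after this change
```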
CHANGELOG.md (44 lines added)
@@ -1,4 +1,48 @@
+## 2025.11.0 (2025/11/22)
+
+### Bug Fixes
+
+- **frontend:** Apply iOS safe area insets to body element instead of individual components. ([d2bc179d](https://github.com/elisiariocouto/leggen/commit/d2bc179d5937172a01ebbfffd35e7617f0ac32af))
+- Fallback to internal_transaction_id when bank transactions do not have transaction_id. ([b1b348ba](https://github.com/elisiariocouto/leggen/commit/b1b348badb5d1ea9c01ef9ecab1003252165468c))
+
+## 2025.10.2 (2025/10/06)
+
+### Bug Fixes
+
+- **frontend:** Improve nginx config. ([d78f4811](https://github.com/elisiariocouto/leggen/commit/d78f4811922df7e637abe65b1d0b1157dd331c3c))
+- **frontend:** Include default mime types. ([7c06a1d8](https://github.com/elisiariocouto/leggen/commit/7c06a1d8b9bca3da2c481d9e89e7564cfffe32a3))
+
+## 2025.10.1 (2025/10/05)
+
+### Bug Fixes
+
+- **frontend:** Fix PWA caching system, remove prompts. ([1cd63731](https://github.com/elisiariocouto/leggen/commit/1cd63731a35a1c77a59d7ae1a898ad8f22e362e4))
+
+### Documentation
+
+- Improve documentation, add gif showing web app. ([0750c41b](https://github.com/elisiariocouto/leggen/commit/0750c41b7b6634900ec19b1701d58b06346028e3))
+
+### Refactor
+
+- **frontend:** Standardize button styling using shadcn Button component. ([38fddeb2](https://github.com/elisiariocouto/leggen/commit/38fddeb281588de41d8ff6292c1dd48443a059a4))
+
+## 2025.10.0 (2025/10/01)
+
+### Bug Fixes
+
+- **gocardless:** Increase timeout to 30 seconds, some requests take some time. ([ca7968cc](https://github.com/elisiariocouto/leggen/commit/ca7968cc3c625e243fe2d75590a9e56f3100072b))
+
 ## 2025.9.26 (2025/09/30)

 ### Debug
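The 2025.11.0 fallback fix is easiest to picture with a small sketch. This is not the project's actual code, only an illustration of the idea, assuming GoCardless payloads shaped like the ones stored in the `rawTransaction` column:

```python
# Illustrative sketch of the 2025.11.0 fix (not the actual implementation):
# prefer the bank-provided transactionId, fall back to internalTransactionId.
def resolve_transaction_id(raw: dict) -> str:
    tx_id = raw.get("transactionId") or raw.get("internalTransactionId")
    if tx_id is None:
        raise ValueError("transaction has neither transactionId nor internalTransactionId")
    return tx_id

print(resolve_transaction_id({"internalTransactionId": "abc123"}))  # -> abc123
```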
README.md (280 lines changed)
@@ -1,13 +1,21 @@
 # 💲 leggen

-An Open Banking CLI and API service for managing bank connections and transactions.
+A self hosted Open Banking Dashboard, API and CLI for managing bank connections and transactions.

-This tool provides a **unified command-line interface** (`leggen`) with both CLI commands and an integrated **FastAPI backend service**, plus a **React Web Interface** to connect to banks using the GoCardless Open Banking API.
-
 Having your bank data accessible through both CLI and REST API gives you the power to backup, analyze, create reports, and integrate with other applications.

+
+
 ## 🛠️ Technologies

+### Frontend
+- [React](https://reactjs.org/): Modern web interface with TypeScript
+- [Vite](https://vitejs.dev/): Fast build tool and development server
+- [Tailwind CSS](https://tailwindcss.com/): Utility-first CSS framework
+- [shadcn/ui](https://ui.shadcn.com/): Modern component system built on Radix UI
+- [TanStack Query](https://tanstack.com/query): Powerful data synchronization for React
+
 ### 🔌 API & Backend
 - [FastAPI](https://fastapi.tiangolo.com/): High-performance async API backend (integrated into `leggen server`)
 - [GoCardless Open Banking API](https://developer.gocardless.com/bank-account-data/overview): for connecting to banks
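The context line about backups and reports is easy to act on once the backend is running. A minimal sketch, assuming the API is reachable on `http://localhost:8000` and that `GET /api/v1/transactions` returns a JSON list of transaction objects; the field names below mirror the database columns but are assumptions about the response schema:

```python
# Sketch: export transactions from the REST API to CSV for backup/reporting.
# Assumes a running backend at localhost:8000; the exact response schema may differ.
import csv
import json
from urllib.request import urlopen

with urlopen("http://localhost:8000/api/v1/transactions") as resp:
    transactions = json.load(resp)

with open("transactions_backup.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["date", "description", "value", "currency"])
    for tx in transactions:
        # Field names are illustrative, not the documented schema.
        writer.writerow([
            tx.get("transactionDate"),
            tx.get("description"),
            tx.get("transactionValue"),
            tx.get("transactionCurrency"),
        ])
```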
@@ -16,12 +24,6 @@ Having your bank data accessible through both CLI and REST API gives you the pow
 ### 📦 Storage
 - [SQLite](https://www.sqlite.org): for storing transactions, simple and easy to use

-### Frontend
-- [React](https://reactjs.org/): Modern web interface with TypeScript
-- [Vite](https://vitejs.dev/): Fast build tool and development server
-- [Tailwind CSS](https://tailwindcss.com/): Utility-first CSS framework
-- [shadcn/ui](https://ui.shadcn.com/): Modern component system built on Radix UI
-- [TanStack Query](https://tanstack.com/query): Powerful data synchronization for React

 ## ✨ Features

@@ -54,10 +56,9 @@ Having your bank data accessible through both CLI and REST API gives you the pow
 1. Create a GoCardless account at [https://gocardless.com/bank-account-data/](https://gocardless.com/bank-account-data/)
 2. Get your API credentials (key and secret)

-### Installation Options
+### Installation

-#### Option 1: Docker Compose (Recommended)
-The easiest way to get started is with Docker Compose, which includes both the React frontend and FastAPI backend:
+#### Docker Compose (Recommended)

 ```bash
 # Clone the repository
@@ -68,50 +69,11 @@ cd leggen
 mkdir -p data && cp config.example.toml data/config.toml
 # Edit data/config.toml with your GoCardless credentials

-# Start all services (frontend + backend)
+# Start all services
 docker compose up -d

 # Access the web interface at http://localhost:3000
-# API is available at http://localhost:8000
-```
-
-#### Production Deployment
-
-For production deployment using published Docker images:
-
-```bash
-# Clone the repository
-git clone https://github.com/elisiariocouto/leggen.git
-cd leggen
-
-# Create your configuration
-mkdir -p data && cp config.example.toml data/config.toml
-# Edit data/config.toml with your GoCardless credentials
-
-# Start production services
-docker compose up -d
-
-# Access the web interface at http://localhost:3000
-# API is available at http://localhost:8000
-```
-
-### Development vs Production
-
-- **Development**: Use `docker compose -f compose.dev.yml up -d` (builds from source)
-- **Production**: Use `docker compose up -d` (uses published images)
-
-#### Option 2: Local Development
-For development or local installation:
-
-```bash
-# Install with uv (recommended) or pip
-uv sync # or pip install -e .
-
-# Start the API service
-uv run leggen server --reload # Development mode with auto-reload
-
-# Use the CLI (in another terminal)
-uv run leggen --help
+# API documentation at http://localhost:3000/api/v1/docs
 ```

 ### Configuration
@@ -153,214 +115,22 @@ case_sensitive = ["SpecificStore"]

 ## 📖 Usage

-### API Service (`leggen server`)
+### Web Interface

-Start the FastAPI backend service:
+Access the React web interface at `http://localhost:3000` after starting the services.
+
+### API Service
+
+Visit `http://localhost:3000/api/v1/docs` for interactive API documentation.
+
+### CLI Commands

 ```bash
-# Production mode
-leggen server
-
-# Development mode with auto-reload
-leggen server --reload
-
-# Custom host and port
-leggen server --host 127.0.0.1 --port 8080
+leggen status # Check connection status
+leggen bank add # Connect to a new bank
+leggen balances # View account balances
+leggen transactions # List transactions
+leggen sync # Trigger background sync
 ```

-**API Documentation**: Visit `http://localhost:8000/docs` for interactive API documentation.
-
-### CLI Commands (`leggen`)
-
-#### Basic Commands
-```bash
-# Check connection status
-leggen status
-
-# Connect to a new bank
-leggen bank add
-
-# View account balances
-leggen balances
-
-# List recent transactions
-leggen transactions --limit 20
-
-# View detailed transactions
-leggen transactions --full
-```
-
-#### Sync Operations
-```bash
-# Start background sync
-leggen sync
-
-# Synchronous sync (wait for completion)
-leggen sync --wait
-
-# Force sync (override running sync)
-leggen sync --force --wait
-```
-
-#### API Integration
-```bash
-# Use custom API URL
-leggen --api-url http://localhost:8080 status
-
-# Set via environment variable
-export LEGGEN_API_URL=http://localhost:8080
-leggen status
-```
-
-### Docker Usage
-
-#### Development (build from source)
-```bash
-# Start development services
-docker compose -f compose.dev.yml up -d
-
-# View service status
-docker compose -f compose.dev.yml ps
-
-# Check logs
-docker compose -f compose.dev.yml logs frontend
-docker compose -f compose.dev.yml logs leggen-server
-
-# Stop development services
-docker compose -f compose.dev.yml down
-```
-
-#### Production (use published images)
-```bash
-# Start production services
-docker compose up -d
-
-# View service status
-docker compose ps
-
-# Check logs
-docker compose logs frontend
-docker compose logs leggen-server
-
-# Access the web interface at http://localhost:3000
-# API documentation at http://localhost:8000/docs
-
-# Stop production services
-docker compose down
-```
-
-## 🔌 API Endpoints
-
-The FastAPI backend provides comprehensive REST endpoints:
-
-### Banks & Connections
-- `GET /api/v1/banks/institutions?country=PT` - List available banks
-- `POST /api/v1/banks/connect` - Create bank connection
-- `GET /api/v1/banks/status` - Connection status
-- `GET /api/v1/banks/countries` - Supported countries
-
-### Accounts & Balances
-- `GET /api/v1/accounts` - List all accounts
-- `GET /api/v1/accounts/{id}` - Account details
-- `GET /api/v1/accounts/{id}/balances` - Account balances
-- `GET /api/v1/accounts/{id}/transactions` - Account transactions
-
-### Transactions
-- `GET /api/v1/transactions` - All transactions with filtering
-- `GET /api/v1/transactions/stats` - Transaction statistics
-
-### Sync & Scheduling
-- `POST /api/v1/sync` - Trigger background sync
-- `POST /api/v1/sync/now` - Synchronous sync
-- `GET /api/v1/sync/status` - Sync status
-- `GET/PUT /api/v1/sync/scheduler` - Scheduler configuration
-
-### Notifications
-- `GET/PUT /api/v1/notifications/settings` - Manage notifications
-- `POST /api/v1/notifications/test` - Test notifications
-
-## 🛠️ Development
-
-### Local Development Setup
-```bash
-# Clone and setup
-git clone https://github.com/elisiariocouto/leggen.git
-cd leggen
-
-# Install dependencies
-uv sync
-
-# Start API service with auto-reload
-uv run leggen server --reload
-
-# Use CLI commands
-uv run leggen status
-```
-
-### Testing
-
-Run the comprehensive test suite with:
-
-```bash
-# Run all tests
-uv run pytest
-
-# Run unit tests only
-uv run pytest tests/unit/
-
-# Run with verbose output
-uv run pytest tests/unit/ -v
-
-# Run specific test files
-uv run pytest tests/unit/test_config.py -v
-uv run pytest tests/unit/test_scheduler.py -v
-uv run pytest tests/unit/test_api_banks.py -v
-
-# Run tests by markers
-uv run pytest -m unit # Unit tests
-uv run pytest -m api # API endpoint tests
-uv run pytest -m cli # CLI tests
-```
-
-The test suite includes:
-- **Configuration management tests** - TOML config loading/saving
-- **API endpoint tests** - FastAPI route testing with mocked dependencies
-- **CLI API client tests** - HTTP client integration testing
-- **Background scheduler tests** - APScheduler job management
-- **Mock data and fixtures** - Realistic test data for banks, accounts, transactions
-
-### Code Structure
-```
-leggen/            # CLI application
-├── commands/      # CLI command implementations
-├── utils/         # Shared utilities
-├── api/           # FastAPI API routes and models
-├── services/      # Business logic
-├── background/    # Background job scheduler
-└── api_client.py  # API client for server communication
-
-tests/             # Test suite
-├── conftest.py    # Shared test fixtures
-└── unit/          # Unit tests
-    ├── test_config.py       # Configuration tests
-    ├── test_scheduler.py    # Background scheduler tests
-    ├── test_api_banks.py    # Banks API tests
-    ├── test_api_accounts.py # Accounts API tests
-    └── test_api_client.py   # CLI API client tests
-```
-
-### Contributing
-1. Fork the repository
-2. Create a feature branch
-3. Make your changes with tests
-4. Submit a pull request
-
-The repository uses GitHub Actions for CI/CD:
-- **CI**: Runs Python tests (`uv run pytest`) and frontend linting/build on every push
-- **Release**: Creates GitHub releases with changelog when tags are pushed
+For more options, run `leggen --help` or `leggen <command> --help`.

 ## ⚠️ Notes
 - This project is in active development
-- GoCardless API rate limits apply
-- Some banks may require additional authorization steps
-- Docker images are automatically built and published on releases
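The removed "## 🔌 API Endpoints" section above still documents the backend routes. A minimal sketch of driving a sync through them, assuming a backend at `http://localhost:8000` and JSON responses; only the paths come from that list, the response field names are assumptions:

```python
# Sketch: trigger a background sync and poll its status via the documented endpoints.
# Paths come from the removed "API Endpoints" README section; response fields are assumed.
import json
import time
from urllib.request import Request, urlopen

BASE = "http://localhost:8000"

# POST /api/v1/sync - trigger background sync
urlopen(Request(f"{BASE}/api/v1/sync", method="POST"))

# GET /api/v1/sync/status - poll until the sync reports completion
for _ in range(30):
    with urlopen(f"{BASE}/api/v1/sync/status") as resp:
        status = json.load(resp)
    print(status)
    if not status.get("running", False):  # "running" is an assumed field name
        break
    time.sleep(2)
```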
alembic.ini (deleted, 148 lines); previous contents:

# A generic, single database configuration.

[alembic]
# path to migration scripts.
# this is typically a path given in POSIX (e.g. forward slashes)
# format, relative to the token %(here)s which refers to the location of this
# ini file
script_location = %(here)s/alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory. for multiple paths, the path separator
# is defined by "path_separator" below.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =

# max length of characters to apply to the "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to <script_location>/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "path_separator"
# below.
# version_locations = %(here)s/bar:%(here)s/bat:%(here)s/alembic/versions

# path_separator; This indicates what character is used to split lists of file
# paths, including version_locations and prepend_sys_path within configparser
# files such as alembic.ini.
# The default rendered in new alembic.ini files is "os", which uses os.pathsep
# to provide os-dependent path splitting.
#
# Note that in order to support legacy alembic.ini files, this default does NOT
# take place if path_separator is not present in alembic.ini. If this
# option is omitted entirely, fallback logic is as follows:
#
# 1. Parsing of the version_locations option falls back to using the legacy
#    "version_path_separator" key, which if absent then falls back to the legacy
#    behavior of splitting on spaces and/or commas.
# 2. Parsing of the prepend_sys_path option falls back to the legacy
#    behavior of splitting on spaces, commas, or colons.
#
# Valid values for path_separator are:
#
# path_separator = :
# path_separator = ;
# path_separator = space
# path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
# Note: The actual URL is configured programmatically in env.py
# sqlalchemy.url = driver://user:pass@localhost/dbname


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the module runner, against the "ruff" module
# hooks = ruff
# ruff.type = module
# ruff.module = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Alternatively, use the exec runner to execute a binary found on your PATH
# hooks = ruff
# ruff.type = exec
# ruff.executable = ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Logging configuration. This is also consumed by the user-maintained
# env.py script only.
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARNING
handlers = console
qualname =

[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
Deleted file (1 line); previous contents:

Generic single-database configuration.
Deleted Alembic environment script (78 lines); previous contents:

from logging.config import fileConfig

from sqlalchemy import engine_from_config, pool

from alembic import context
from leggen.models.database import SQLModel
from leggen.services.database import get_database_url

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Set the database URL from our configuration
config.set_main_option("sqlalchemy.url", get_database_url())

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
target_metadata = SQLModel.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
Deleted Alembic revision template (script.py.mako, 28 lines); previous contents:

"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
    """Upgrade schema."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Downgrade schema."""
    ${downgrades if downgrades else "pass"}
Deleted Alembic migration 1ba02efe481c (102 lines); previous contents:

"""migrate_to_composite_key

Migrate transactions table to use composite primary key (accountId, transactionId).

Revision ID: 1ba02efe481c
Revises: bf30246cb723
Create Date: 2025-09-30 23:16:34.637762

"""

from typing import Sequence, Union

from sqlalchemy import text

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "1ba02efe481c"
down_revision: Union[str, Sequence[str], None] = "bf30246cb723"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Migrate to composite primary key."""
    conn = op.get_bind()

    # Check if migration is needed
    result = conn.execute(
        text("""
        SELECT name FROM sqlite_master
        WHERE type='table' AND name='transactions'
    """)
    )

    if not result.fetchone():
        return

    # Create temporary table with new schema
    op.execute("""
        CREATE TABLE transactions_temp (
            accountId TEXT NOT NULL,
            transactionId TEXT NOT NULL,
            internalTransactionId TEXT,
            institutionId TEXT NOT NULL,
            iban TEXT,
            transactionDate DATETIME,
            description TEXT,
            transactionValue REAL,
            transactionCurrency TEXT,
            transactionStatus TEXT,
            rawTransaction JSON NOT NULL,
            PRIMARY KEY (accountId, transactionId)
        )
    """)

    # Insert deduplicated data (keep most recent duplicate)
    op.execute("""
        INSERT INTO transactions_temp
        SELECT
            accountId,
            json_extract(rawTransaction, '$.transactionId') as transactionId,
            internalTransactionId,
            institutionId,
            iban,
            transactionDate,
            description,
            transactionValue,
            transactionCurrency,
            transactionStatus,
            rawTransaction
        FROM (
            SELECT *,
                ROW_NUMBER() OVER (
                    PARTITION BY accountId, json_extract(rawTransaction, '$.transactionId')
                    ORDER BY transactionDate DESC, rowid DESC
                ) as rn
            FROM transactions
            WHERE json_extract(rawTransaction, '$.transactionId') IS NOT NULL
                AND accountId IS NOT NULL
        ) WHERE rn = 1
    """)

    # Replace tables
    op.execute("DROP TABLE transactions")
    op.execute("ALTER TABLE transactions_temp RENAME TO transactions")

    # Recreate indexes
    op.create_index(
        "idx_transactions_internal_id", "transactions", ["internalTransactionId"]
    )
    op.create_index("idx_transactions_date", "transactions", ["transactionDate"])
    op.create_index(
        "idx_transactions_account_date",
        "transactions",
        ["accountId", "transactionDate"],
    )
    op.create_index("idx_transactions_amount", "transactions", ["transactionValue"])


def downgrade() -> None:
    """Not implemented - would require changing primary key back."""
Deleted Alembic migration 4819c868ebc1 (56 lines); previous contents:

"""add_transaction_enrichments_table

Add transaction_enrichments table for storing enriched transaction data.

Revision ID: 4819c868ebc1
Revises: dd9f6a55604c
Create Date: 2025-09-30 23:20:00.969614

"""

from typing import Sequence, Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "4819c868ebc1"
down_revision: Union[str, Sequence[str], None] = "dd9f6a55604c"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Create transaction_enrichments table."""
    op.create_table(
        "transaction_enrichments",
        sa.Column("accountId", sa.String(), nullable=False),
        sa.Column("transactionId", sa.String(), nullable=False),
        sa.Column("clean_name", sa.String(), nullable=True),
        sa.Column("category", sa.String(), nullable=True),
        sa.Column("logo_url", sa.String(), nullable=True),
        sa.Column("created_at", sa.DateTime(), nullable=False),
        sa.Column("updated_at", sa.DateTime(), nullable=False),
        sa.ForeignKeyConstraint(
            ["accountId", "transactionId"],
            ["transactions.accountId", "transactions.transactionId"],
            ondelete="CASCADE",
        ),
        sa.PrimaryKeyConstraint("accountId", "transactionId"),
    )

    # Create indexes
    op.create_index(
        "idx_transaction_enrichments_category", "transaction_enrichments", ["category"]
    )
    op.create_index(
        "idx_transaction_enrichments_clean_name",
        "transaction_enrichments",
        ["clean_name"],
    )


def downgrade() -> None:
    """Drop transaction_enrichments table."""
    op.drop_table("transaction_enrichments")
Deleted Alembic migration be8d5807feca (33 lines); previous contents:

"""add_display_name_column

Add display_name column to accounts table.

Revision ID: be8d5807feca
Revises: 1ba02efe481c
Create Date: 2025-09-30 23:16:34.929968

"""

from typing import Sequence, Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "be8d5807feca"
down_revision: Union[str, Sequence[str], None] = "1ba02efe481c"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Add display_name column to accounts table."""
    with op.batch_alter_table("accounts", schema=None) as batch_op:
        batch_op.add_column(sa.Column("display_name", sa.String(), nullable=True))


def downgrade() -> None:
    """Remove display_name column."""
    with op.batch_alter_table("accounts", schema=None) as batch_op:
        batch_op.drop_column("display_name")
Deleted Alembic migration bf30246cb723 (62 lines); previous contents:

"""migrate_balance_timestamps

Convert Unix timestamps to datetime strings in balances table.

Revision ID: bf30246cb723
Revises: de8bfb1169d4
Create Date: 2025-09-30 23:14:03.128959

"""

from datetime import datetime
from typing import Sequence, Union

from sqlalchemy import text

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "bf30246cb723"
down_revision: Union[str, Sequence[str], None] = "de8bfb1169d4"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Convert all Unix timestamps to datetime strings."""
    conn = op.get_bind()

    # Get all balances with REAL timestamps
    result = conn.execute(
        text("""
        SELECT id, timestamp
        FROM balances
        WHERE typeof(timestamp) = 'real'
        ORDER BY id
    """)
    )

    unix_records = result.fetchall()

    if not unix_records:
        return

    # Convert and update in batches
    for record_id, unix_timestamp in unix_records:
        try:
            # Convert Unix timestamp to datetime string
            dt_string = datetime.fromtimestamp(float(unix_timestamp)).isoformat()

            # Update the record
            conn.execute(
                text("UPDATE balances SET timestamp = :dt WHERE id = :id"),
                {"dt": dt_string, "id": record_id},
            )
        except Exception:
            continue

    conn.commit()


def downgrade() -> None:
    """Not implemented - converting back would lose precision."""
Deleted Alembic migration dd9f6a55604c (33 lines); previous contents:

"""add_logo_column

Add logo column to accounts table.

Revision ID: dd9f6a55604c
Revises: f854fd498a6e
Create Date: 2025-09-30 23:16:35.530858

"""

from typing import Sequence, Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "dd9f6a55604c"
down_revision: Union[str, Sequence[str], None] = "f854fd498a6e"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Add logo column to accounts table."""
    with op.batch_alter_table("accounts", schema=None) as batch_op:
        batch_op.add_column(sa.Column("logo", sa.String(), nullable=True))


def downgrade() -> None:
    """Remove logo column."""
    with op.batch_alter_table("accounts", schema=None) as batch_op:
        batch_op.drop_column("logo")
Deleted Alembic migration de8bfb1169d4 (95 lines); previous contents:

"""create_initial_tables

Revision ID: de8bfb1169d4
Revises:
Create Date: 2025-09-30 23:09:24.255875

"""

from typing import Sequence, Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "de8bfb1169d4"
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Create initial database tables."""
    # Create accounts table
    op.create_table(
        "accounts",
        sa.Column("id", sa.String(), nullable=False),
        sa.Column("institution_id", sa.String(), nullable=False),
        sa.Column("status", sa.String(), nullable=False),
        sa.Column("iban", sa.String(), nullable=True),
        sa.Column("name", sa.String(), nullable=True),
        sa.Column("currency", sa.String(), nullable=True),
        sa.Column("created", sa.DateTime(), nullable=False),
        sa.Column("last_accessed", sa.DateTime(), nullable=True),
        sa.Column("last_updated", sa.DateTime(), nullable=True),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index("idx_accounts_institution_id", "accounts", ["institution_id"])
    op.create_index("idx_accounts_status", "accounts", ["status"])

    # Create balances table
    op.create_table(
        "balances",
        sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
        sa.Column("account_id", sa.String(), nullable=False),
        sa.Column("bank", sa.String(), nullable=False),
        sa.Column("status", sa.String(), nullable=False),
        sa.Column("iban", sa.String(), nullable=False),
        sa.Column("amount", sa.Float(), nullable=False),
        sa.Column("currency", sa.String(), nullable=False),
        sa.Column("type", sa.String(), nullable=False),
        sa.Column("timestamp", sa.DateTime(), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index("idx_balances_account_id", "balances", ["account_id"])
    op.create_index("idx_balances_timestamp", "balances", ["timestamp"])
    op.create_index(
        "idx_balances_account_type_timestamp",
        "balances",
        ["account_id", "type", "timestamp"],
    )

    # Create transactions table (old schema with internalTransactionId as PK)
    op.create_table(
        "transactions",
        sa.Column("accountId", sa.String(), nullable=False),
        sa.Column("transactionId", sa.String(), nullable=False),
        sa.Column("internalTransactionId", sa.String(), nullable=True),
        sa.Column("institutionId", sa.String(), nullable=False),
        sa.Column("iban", sa.String(), nullable=True),
        sa.Column("transactionDate", sa.DateTime(), nullable=True),
        sa.Column("description", sa.String(), nullable=True),
        sa.Column("transactionValue", sa.Float(), nullable=True),
        sa.Column("transactionCurrency", sa.String(), nullable=True),
        sa.Column("transactionStatus", sa.String(), nullable=True),
        sa.Column("rawTransaction", sa.JSON(), nullable=False),
        sa.PrimaryKeyConstraint("internalTransactionId"),
    )
    op.create_index(
        "idx_transactions_internal_id", "transactions", ["internalTransactionId"]
    )
    op.create_index("idx_transactions_date", "transactions", ["transactionDate"])
    op.create_index(
        "idx_transactions_account_date",
        "transactions",
        ["accountId", "transactionDate"],
    )
    op.create_index("idx_transactions_amount", "transactions", ["transactionValue"])


def downgrade() -> None:
    """Drop initial tables."""
    op.drop_table("transactions")
    op.drop_table("balances")
    op.drop_table("accounts")
Deleted Alembic migration f854fd498a6e (59 lines); previous contents:

"""add_sync_operations_table

Add sync_operations table for tracking synchronization operations.

Revision ID: f854fd498a6e
Revises: be8d5807feca
Create Date: 2025-09-30 23:16:35.229062

"""

from typing import Sequence, Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "f854fd498a6e"
down_revision: Union[str, Sequence[str], None] = "be8d5807feca"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Create sync_operations table."""
    op.create_table(
        "sync_operations",
        sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
        sa.Column("started_at", sa.DateTime(), nullable=False),
        sa.Column("completed_at", sa.DateTime(), nullable=True),
        sa.Column("success", sa.Boolean(), nullable=True),
        sa.Column(
            "accounts_processed", sa.Integer(), nullable=False, server_default="0"
        ),
        sa.Column(
            "transactions_added", sa.Integer(), nullable=False, server_default="0"
        ),
        sa.Column(
            "transactions_updated", sa.Integer(), nullable=False, server_default="0"
        ),
        sa.Column("balances_updated", sa.Integer(), nullable=False, server_default="0"),
        sa.Column("duration_seconds", sa.Float(), nullable=True),
        sa.Column("errors", sa.String(), nullable=True),
        sa.Column("logs", sa.String(), nullable=True),
        sa.Column("trigger_type", sa.String(), nullable=False, server_default="manual"),
        sa.PrimaryKeyConstraint("id"),
    )

    # Create indexes
    op.create_index("idx_sync_operations_started_at", "sync_operations", ["started_at"])
    op.create_index("idx_sync_operations_success", "sync_operations", ["success"])
    op.create_index(
        "idx_sync_operations_trigger_type", "sync_operations", ["trigger_type"]
    )


def downgrade() -> None:
    """Drop sync_operations table."""
    op.drop_table("sync_operations")
docs/leggen_demo.gif: new binary file, 548 KiB (binary content not shown).
@@ -1,33 +1,102 @@
 server {
+    # MIME types for PWA
+    include mime.types;
+    types {
+        application/manifest+json webmanifest;
+    }
+
     listen 80;
     server_name localhost;
     root /usr/share/nginx/html;
     index index.html;

+    # Trust proxy headers from Caddy/upstream proxy
+    set_real_ip_from 0.0.0.0/0;
+    real_ip_header X-Forwarded-For;
+    real_ip_recursive on;
+
+    # Security headers
+    add_header X-Content-Type-Options "nosniff" always;
+    add_header X-Frame-Options "SAMEORIGIN" always;
+    add_header X-XSS-Protection "1; mode=block" always;
+    add_header Referrer-Policy "strict-origin-when-cross-origin" always;
+
     # Enable gzip compression
     gzip on;
     gzip_vary on;
     gzip_min_length 1024;
     gzip_proxied expired no-cache no-store private auth;
-    gzip_types text/plain text/css text/xml text/javascript application/javascript application/xml+rss application/json;
+    gzip_types text/plain text/css text/xml text/javascript application/javascript application/xml+rss application/json application/manifest+json image/svg+xml;

-    # Handle client-side routing
+    # Service worker - no cache, must revalidate
+    location ~ ^/(sw\.js|workbox-.*\.js)$ {
+        add_header Cache-Control "no-cache, no-store, must-revalidate" always;
+        add_header Pragma "no-cache" always;
+        add_header Expires "0" always;
+        add_header Service-Worker-Allowed "/" always;
+        types {
+            application/javascript js;
+        }
+    }
+
+    # PWA manifest - short cache with revalidation
+    location ~ ^/manifest\.webmanifest$ {
+        add_header Cache-Control "public, max-age=3600, must-revalidate" always;
+        types {
+            application/manifest+json webmanifest;
+        }
+    }
+
+    # Handle client-side routing (SPA)
     location / {
-        try_files $uri $uri/ /index.html;
+        autoindex off;
+        expires off;
+        add_header Cache-Control "public, max-age=0, s-maxage=0, must-revalidate" always;
+        try_files $uri $uri/ /index.html =404;
     }

     # API proxy to backend (configurable via API_BACKEND_URL env var)
     location /api/ {
         proxy_pass ${API_BACKEND_URL};
-        proxy_set_header Host $host;
+        proxy_set_header Host $http_host;
         proxy_set_header X-Real-IP $remote_addr;
         proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
-        proxy_set_header X-Forwarded-Proto $scheme;
+        proxy_set_header X-Forwarded-Proto $http_x_forwarded_proto;
+        proxy_set_header X-Forwarded-Host $http_x_forwarded_host;
+        proxy_redirect off;
+
+        # Timeouts
+        proxy_connect_timeout 60s;
+        proxy_send_timeout 60s;
+        proxy_read_timeout 60s;
     }

-    # Cache static assets
-    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg)$ {
+    # Cache static assets with immutable flag
+    location ~* \.(css|js)$ {
         expires 1y;
-        add_header Cache-Control "public, immutable";
+        add_header Cache-Control "public, immutable" always;
+        access_log off;
+    }
+
+    # Cache images and icons
+    location ~* \.(png|jpg|jpeg|gif|ico|svg|webp)$ {
+        expires 1y;
+        add_header Cache-Control "public, immutable" always;
+        access_log off;
+    }
+
+    # Cache fonts (if any are added later)
+    location ~* \.(woff|woff2|ttf|eot|otf)$ {
+        expires 1y;
+        add_header Cache-Control "public, immutable" always;
+        add_header Access-Control-Allow-Origin "*" always;
+        access_log off;
+    }
+
+    # Other static files
+    location ~* \.(xml|txt)$ {
+        expires 1d;
+        add_header Cache-Control "public, must-revalidate" always;
     }
 }
@@ -234,24 +234,28 @@ export default function AccountSettings() {
   }}
   autoFocus
 />
-<button
+<Button
   onClick={handleEditSave}
   disabled={
     !editingName.trim() ||
     updateAccountMutation.isPending
   }
-  className="p-1 text-green-600 hover:text-green-700 disabled:opacity-50 disabled:cursor-not-allowed"
+  size="icon"
+  variant="ghost"
+  className="h-8 w-8 text-green-600 hover:text-green-700 hover:bg-green-100"
   title="Save changes"
 >
   <Check className="h-4 w-4" />
-</button>
-<button
+</Button>
+<Button
   onClick={handleEditCancel}
-  className="p-1 text-gray-600 hover:text-gray-700"
+  size="icon"
+  variant="ghost"
+  className="h-8 w-8"
   title="Cancel editing"
 >
   <X className="h-4 w-4" />
-</button>
+</Button>
 </div>
 <p className="text-sm text-muted-foreground truncate">
   {account.institution_id}

@@ -265,13 +269,15 @@ export default function AccountSettings() {
   account.name ||
   "Unnamed Account"}
 </h4>
-<button
+<Button
   onClick={() => handleEditStart(account)}
-  className="flex-shrink-0 p-1 text-muted-foreground hover:text-foreground transition-colors"
+  size="icon"
+  variant="ghost"
+  className="h-7 w-7 flex-shrink-0"
   title="Edit account name"
 >
   <Edit2 className="h-4 w-4" />
-</button>
+</Button>
 </div>
 <p className="text-sm text-muted-foreground truncate">
   {account.institution_id}
@@ -273,24 +273,28 @@ export default function AccountsOverview() {
   }}
   autoFocus
 />
-<button
+<Button
   onClick={handleEditSave}
   disabled={
     !editingName.trim() ||
     updateAccountMutation.isPending
   }
-  className="p-1 text-green-600 hover:text-green-700 disabled:opacity-50 disabled:cursor-not-allowed"
+  size="icon"
+  variant="ghost"
+  className="h-8 w-8 text-green-600 hover:text-green-700 hover:bg-green-100"
   title="Save changes"
 >
   <Check className="h-4 w-4" />
-</button>
-<button
+</Button>
+<Button
   onClick={handleEditCancel}
-  className="p-1 text-gray-600 hover:text-gray-700"
+  size="icon"
+  variant="ghost"
+  className="h-8 w-8"
   title="Cancel editing"
 >
   <X className="h-4 w-4" />
-</button>
+</Button>
 </div>
 <p className="text-sm text-muted-foreground truncate">
   {account.institution_id}

@@ -304,13 +308,15 @@ export default function AccountsOverview() {
   account.name ||
   "Unnamed Account"}
 </h4>
-<button
+<Button
   onClick={() => handleEditStart(account)}
-  className="flex-shrink-0 p-1 text-muted-foreground hover:text-foreground transition-colors"
+  size="icon"
+  variant="ghost"
+  className="h-7 w-7 flex-shrink-0"
   title="Edit account name"
 >
   <Edit2 className="h-4 w-4" />
-</button>
+</Button>
 </div>
 <p className="text-sm text-muted-foreground truncate">
   {account.institution_id}
@@ -61,7 +61,7 @@ export function AppSidebar({ ...props }: React.ComponentProps<typeof Sidebar>) {
 };

 return (
-  <Sidebar collapsible="icon" className="pt-safe-top pl-safe-left" {...props}>
+  <Sidebar collapsible="icon" {...props}>
     <SidebarHeader>
       <SidebarMenu>
         <SidebarMenuItem>
@@ -166,13 +166,15 @@ export default function NotificationFiltersDrawer({
   className="flex items-center space-x-1 bg-secondary text-secondary-foreground px-2 py-1 rounded-md text-sm"
 >
   <span>{filter}</span>
-  <button
+  <Button
     type="button"
     onClick={() => removeCaseInsensitiveFilter(index)}
-    className="text-secondary-foreground hover:text-foreground"
+    variant="ghost"
+    size="icon"
+    className="h-5 w-5 hover:bg-secondary-foreground/10"
   >
     <X className="h-3 w-3" />
-  </button>
+  </Button>
 </div>
 ))
 ) : (

@@ -222,13 +224,15 @@ export default function NotificationFiltersDrawer({
   className="flex items-center space-x-1 bg-secondary text-secondary-foreground px-2 py-1 rounded-md text-sm"
 >
   <span>{filter}</span>
-  <button
+  <Button
     type="button"
     onClick={() => removeCaseSensitiveFilter(index)}
-    className="text-secondary-foreground hover:text-foreground"
+    variant="ghost"
+    size="icon"
+    className="h-5 w-5 hover:bg-secondary-foreground/10"
   >
     <X className="h-3 w-3" />
-  </button>
+  </Button>
 </div>
 ))
 ) : (
@@ -1,160 +0,0 @@
-import { useEffect, useState } from "react";
-import { X, Download, RotateCcw } from "lucide-react";
-
-interface BeforeInstallPromptEvent extends Event {
-prompt(): Promise<void>;
-userChoice: Promise<{ outcome: "accepted" | "dismissed" }>;
-}
-
-interface PWAPromptProps {
-onInstall?: () => void;
-}
-
-export function PWAInstallPrompt({ onInstall }: PWAPromptProps) {
-const [deferredPrompt, setDeferredPrompt] =
-useState<BeforeInstallPromptEvent | null>(null);
-const [showPrompt, setShowPrompt] = useState(false);
-
-useEffect(() => {
-const handler = (e: Event) => {
-// Prevent the mini-infobar from appearing on mobile
-e.preventDefault();
-setDeferredPrompt(e as BeforeInstallPromptEvent);
-setShowPrompt(true);
-};
-
-window.addEventListener("beforeinstallprompt", handler);
-
-return () => window.removeEventListener("beforeinstallprompt", handler);
-}, []);
-
-const handleInstall = async () => {
-if (!deferredPrompt) return;
-
-try {
-await deferredPrompt.prompt();
-const { outcome } = await deferredPrompt.userChoice;
-
-if (outcome === "accepted") {
-onInstall?.();
-}
-
-setDeferredPrompt(null);
-setShowPrompt(false);
-} catch (error) {
-console.error("Error installing PWA:", error);
-}
-};
-
-const handleDismiss = () => {
-setShowPrompt(false);
-setDeferredPrompt(null);
-};
-
-if (!showPrompt || !deferredPrompt) return null;
-
-return (
-<div className="fixed bottom-4 left-4 right-4 md:left-auto md:right-4 md:max-w-sm bg-white dark:bg-gray-800 border border-gray-200 dark:border-gray-700 rounded-lg shadow-lg p-4 z-50">
-<div className="flex items-start gap-3">
-<div className="flex-shrink-0">
-<Download className="h-5 w-5 text-blue-600 dark:text-blue-400" />
-</div>
-<div className="flex-1 min-w-0">
-<p className="text-sm font-medium text-gray-900 dark:text-gray-100">
-Install Leggen
-</p>
-<p className="text-sm text-gray-500 dark:text-gray-400">
-Add to your home screen for quick access
-</p>
-</div>
-<button
-onClick={handleDismiss}
-className="flex-shrink-0 text-gray-400 hover:text-gray-500 dark:hover:text-gray-300"
->
-<X className="h-4 w-4" />
-</button>
-</div>
-<div className="mt-3 flex gap-2">
-<button
-onClick={handleInstall}
-className="flex-1 bg-blue-600 text-white text-sm font-medium px-3 py-2 rounded-md hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-blue-500"
->
-Install
-</button>
-<button
-onClick={handleDismiss}
-className="px-3 py-2 text-sm font-medium text-gray-700 dark:text-gray-300 hover:text-gray-900 dark:hover:text-gray-100"
->
-Not now
-</button>
-</div>
-</div>
-);
-}
-
-interface PWAUpdatePromptProps {
-updateAvailable: boolean;
-onUpdate: () => void;
-}
-
-export function PWAUpdatePrompt({
-updateAvailable,
-onUpdate,
-}: PWAUpdatePromptProps) {
-const [showPrompt, setShowPrompt] = useState(false);
-
-useEffect(() => {
-if (updateAvailable) {
-setShowPrompt(true);
-}
-}, [updateAvailable]);
-
-const handleUpdate = () => {
-onUpdate();
-setShowPrompt(false);
-};
-
-const handleDismiss = () => {
-setShowPrompt(false);
-};
-
-if (!showPrompt || !updateAvailable) return null;
-
-return (
-<div className="fixed top-4 left-4 right-4 md:left-auto md:right-4 md:max-w-sm bg-white dark:bg-gray-800 border border-gray-200 dark:border-gray-700 rounded-lg shadow-lg p-4 z-50">
-<div className="flex items-start gap-3">
-<div className="flex-shrink-0">
-<RotateCcw className="h-5 w-5 text-green-600 dark:text-green-400" />
-</div>
-<div className="flex-1 min-w-0">
-<p className="text-sm font-medium text-gray-900 dark:text-gray-100">
-Update Available
-</p>
-<p className="text-sm text-gray-500 dark:text-gray-400">
-A new version of Leggen is ready to install
-</p>
-</div>
-<button
-onClick={handleDismiss}
-className="flex-shrink-0 text-gray-400 hover:text-gray-500 dark:hover:text-gray-300"
->
-<X className="h-4 w-4" />
-</button>
-</div>
-<div className="mt-3 flex gap-2">
-<button
-onClick={handleUpdate}
-className="flex-1 bg-green-600 text-white text-sm font-medium px-3 py-2 rounded-md hover:bg-green-700 focus:outline-none focus:ring-2 focus:ring-green-500"
->
-Update Now
-</button>
-<button
-onClick={handleDismiss}
-className="px-3 py-2 text-sm font-medium text-gray-700 dark:text-gray-300 hover:text-gray-900 dark:hover:text-gray-100"
->
-Later
-</button>
-</div>
-</div>
-);
-}
@@ -403,24 +403,28 @@ export default function Settings() {
 }}
 autoFocus
 />
-<button
+<Button
 onClick={handleEditSave}
 disabled={
 !editingName.trim() ||
 updateAccountMutation.isPending
 }
-className="p-1 text-green-600 hover:text-green-700 disabled:opacity-50 disabled:cursor-not-allowed"
+size="icon"
+variant="ghost"
+className="h-8 w-8 text-green-600 hover:text-green-700 hover:bg-green-100"
 title="Save changes"
 >
 <Check className="h-4 w-4" />
-</button>
-<button
+</Button>
+<Button
 onClick={handleEditCancel}
-className="p-1 text-gray-600 hover:text-gray-700"
+size="icon"
+variant="ghost"
+className="h-8 w-8"
 title="Cancel editing"
 >
 <X className="h-4 w-4" />
-</button>
+</Button>
 </div>
 <p className="text-sm text-muted-foreground truncate">
 {account.institution_id}
@@ -434,13 +438,15 @@ export default function Settings() {
 account.name ||
 "Unnamed Account"}
 </h4>
-<button
+<Button
 onClick={() => handleEditStart(account)}
-className="flex-shrink-0 p-1 text-muted-foreground hover:text-foreground transition-colors"
+size="icon"
+variant="ghost"
+className="h-7 w-7 flex-shrink-0"
 title="Edit account name"
 >
 <Edit2 className="h-4 w-4" />
-</button>
+</Button>
 </div>
 <p className="text-sm text-muted-foreground truncate">
 {account.institution_id}
@@ -579,7 +585,7 @@ export default function Settings() {
 Created {formatDate(connection.created_at)}
 </p>
 </div>
-<button
+<Button
 onClick={() => {
 const isWorking =
 connection.status.toLowerCase() === "ln";
@@ -594,11 +600,13 @@ export default function Settings() {
 }
 }}
 disabled={deleteBankConnectionMutation.isPending}
-className="p-1 text-muted-foreground hover:text-destructive transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
+size="icon"
+variant="ghost"
+className="h-8 w-8 text-muted-foreground hover:text-destructive"
 title="Delete connection"
 >
 <Trash2 className="h-4 w-4" />
-</button>
+</Button>
 </div>
 </div>
 </div>
@@ -31,7 +31,7 @@ export function SiteHeader() {
 });

 return (
-<header className="flex h-16 shrink-0 items-center gap-2 border-b transition-[width,height] ease-linear pt-safe-top">
+<header className="flex h-16 shrink-0 items-center gap-2 border-b transition-[width,height] ease-linear">
 <div className="flex w-full items-center gap-1 px-4 lg:gap-2 lg:px-6">
 <SidebarTrigger className="-ml-1" />
 <Separator
@@ -259,14 +259,15 @@ export default function TransactionsTable() {
 cell: ({ row }) => {
 const transaction = row.original;
 return (
-<button
+<Button
 onClick={() => handleViewRaw(transaction)}
-className="inline-flex items-center px-2 py-1 text-xs bg-muted text-muted-foreground rounded hover:bg-accent transition-colors"
+variant="ghost"
+size="sm"
 title="View raw transaction data"
 >
 <Eye className="h-3 w-3 mr-1" />
 Raw
-</button>
+</Button>
 );
 },
 },
@@ -530,14 +531,15 @@ export default function TransactionsTable() {
 transaction.transaction_currency,
 )}
 </p>
-<button
+<Button
 onClick={() => handleViewRaw(transaction)}
-className="inline-flex items-center px-2 py-1 text-xs bg-muted text-muted-foreground rounded hover:bg-accent transition-colors"
+variant="ghost"
+size="sm"
 title="View raw transaction data"
 >
 <Eye className="h-3 w-3 mr-1" />
 Raw
-</button>
+</Button>
 </div>
 </div>
 </div>
@@ -1,72 +0,0 @@
-import { useEffect, useState } from "react";
-
-interface PWAUpdate {
-updateAvailable: boolean;
-updateSW: () => Promise<void>;
-forceReload: () => Promise<void>;
-}
-
-export function usePWA(): PWAUpdate {
-const [updateAvailable, setUpdateAvailable] = useState(false);
-const [updateSW, setUpdateSW] = useState<() => Promise<void>>(
-() => async () => {},
-);
-
-const forceReload = async (): Promise<void> => {
-try {
-// Clear all caches
-if ("caches" in window) {
-const cacheNames = await caches.keys();
-await Promise.all(
-cacheNames.map((cacheName) => caches.delete(cacheName)),
-);
-console.log("All caches cleared");
-}
-
-// Unregister service worker
-if ("serviceWorker" in navigator) {
-const registrations = await navigator.serviceWorker.getRegistrations();
-await Promise.all(
-registrations.map((registration) => registration.unregister()),
-);
-console.log("All service workers unregistered");
-}
-
-// Force reload
-window.location.reload();
-} catch (error) {
-console.error("Error during force reload:", error);
-// Fallback: just reload the page
-window.location.reload();
-}
-};
-
-useEffect(() => {
-// Check if SW registration is available
-if ("serviceWorker" in navigator) {
-// Import the registerSW function
-import("virtual:pwa-register")
-.then(({ registerSW }) => {
-const updateSWFunction = registerSW({
-onNeedRefresh() {
-setUpdateAvailable(true);
-setUpdateSW(() => updateSWFunction);
-},
-onOfflineReady() {
-console.log("App ready to work offline");
-},
-});
-})
-.catch(() => {
-// PWA not available in development mode or when disabled
-console.log("PWA registration not available");
-});
-}
-}, []);
-
-return {
-updateAvailable,
-updateSW,
-forceReload,
-};
-}
@@ -1,39 +0,0 @@
-import { useEffect } from "react";
-import { useQuery } from "@tanstack/react-query";
-import { apiClient } from "../lib/api";
-
-const VERSION_STORAGE_KEY = "leggen_app_version";
-
-export function useVersionCheck(forceReload: () => Promise<void>) {
-const { data: healthStatus, isSuccess: healthSuccess } = useQuery({
-queryKey: ["health"],
-queryFn: apiClient.getHealth,
-refetchInterval: 30000,
-retry: false,
-staleTime: 0, // Always consider data stale to ensure fresh version checks
-});
-
-useEffect(() => {
-if (healthSuccess && healthStatus?.version) {
-const currentVersion = healthStatus.version;
-const storedVersion = localStorage.getItem(VERSION_STORAGE_KEY);
-
-if (storedVersion && storedVersion !== currentVersion) {
-console.log(
-`Version mismatch detected: stored=${storedVersion}, current=${currentVersion}`,
-);
-console.log("Clearing cache and reloading...");
-
-// Update stored version first
-localStorage.setItem(VERSION_STORAGE_KEY, currentVersion);
-
-// Force reload to clear cache
-forceReload();
-} else if (!storedVersion) {
-// First time loading, store the version
-localStorage.setItem(VERSION_STORAGE_KEY, currentVersion);
-console.log(`Version stored: ${currentVersion}`);
-}
-}
-}, [healthSuccess, healthStatus?.version, forceReload]);
-}
@@ -86,5 +86,9 @@
 }
 body {
 @apply bg-background text-foreground;
+padding-top: var(--safe-area-inset-top);
+padding-bottom: var(--safe-area-inset-bottom);
+padding-left: var(--safe-area-inset-left);
+padding-right: var(--safe-area-inset-right);
 }
 }
@@ -5,6 +5,7 @@ import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
 import { ThemeProvider } from "./contexts/ThemeContext";
 import "./index.css";
 import { routeTree } from "./routeTree.gen";
+import { registerSW } from "virtual:pwa-register";

 const router = createRouter({ routeTree });

@@ -17,6 +18,57 @@ const queryClient = new QueryClient({
 },
 });

+const intervalMS = 60 * 60 * 1000;
+
+registerSW({
+onRegisteredSW(swUrl, r) {
+console.log("[PWA] Service worker registered successfully");
+
+if (r) {
+setInterval(async () => {
+console.log("[PWA] Checking for updates...");
+
+if (r.installing) {
+console.log("[PWA] Update already installing, skipping check");
+return;
+}
+
+if (!navigator) {
+console.log("[PWA] Navigator not available, skipping check");
+return;
+}
+
+if ("connection" in navigator && !navigator.onLine) {
+console.log("[PWA] Device is offline, skipping check");
+return;
+}
+
+try {
+const resp = await fetch(swUrl, {
+cache: "no-store",
+headers: {
+cache: "no-store",
+"cache-control": "no-cache",
+},
+});
+
+if (resp?.status === 200) {
+console.log("[PWA] Update check successful, triggering update");
+await r.update();
+} else {
+console.log(`[PWA] Update check returned status: ${resp?.status}`);
+}
+} catch (error) {
+console.error("[PWA] Error checking for updates:", error);
+}
+}, intervalMS);
+}
+},
+onOfflineReady() {
+console.log("[PWA] App ready to work offline");
+},
+});
+
 createRoot(document.getElementById("root")!).render(
 <StrictMode>
 <QueryClientProvider client={queryClient}>
@@ -1,31 +1,10 @@
 import { createRootRoute, Outlet } from "@tanstack/react-router";
 import { AppSidebar } from "../components/AppSidebar";
 import { SiteHeader } from "../components/SiteHeader";
-import { PWAInstallPrompt, PWAUpdatePrompt } from "../components/PWAPrompts";
-import { usePWA } from "../hooks/usePWA";
-import { useVersionCheck } from "../hooks/useVersionCheck";
 import { SidebarInset, SidebarProvider } from "../components/ui/sidebar";
 import { Toaster } from "../components/ui/sonner";

 function RootLayout() {
-const { updateAvailable, updateSW, forceReload } = usePWA();
-
-// Check for version mismatches and force reload if needed
-useVersionCheck(forceReload);
-
-const handlePWAInstall = () => {
-console.log("PWA installed successfully");
-};
-
-const handlePWAUpdate = async () => {
-try {
-await updateSW();
-console.log("PWA updated successfully");
-} catch (error) {
-console.error("Error updating PWA:", error);
-}
-};
-
 return (
 <SidebarProvider
 style={
@@ -43,13 +22,6 @@ function RootLayout() {
 </main>
 </SidebarInset>

-{/* PWA Prompts */}
-<PWAInstallPrompt onInstall={handlePWAInstall} />
-<PWAUpdatePrompt
-updateAvailable={updateAvailable}
-onUpdate={handlePWAUpdate}
-/>
-
 {/* Toast Notifications */}
 <Toaster />
 </SidebarProvider>
@@ -11,10 +11,7 @@ export default defineConfig({
 VitePWA({
 registerType: "autoUpdate",
 includeAssets: [
-"favicon.ico",
-"apple-touch-icon-180x180.png",
-"maskable-icon-512x512.png",
-"robots.txt",
+"robots.txt"
 ],
 manifest: {
 name: "Leggen",
@@ -2,6 +2,7 @@ import click

 from leggen.api_client import LeggenAPIClient
 from leggen.main import cli
+from leggen.utils.disk import save_file
 from leggen.utils.text import info, print_table, success, warning


@@ -62,6 +63,9 @@ def add(ctx):
 # Connect to bank via API
 result = api_client.connect_to_bank(bank_id, "http://localhost:8000/")

+# Save requisition details
+save_file(f"req_{result['id']}.json", result)
+
 success("Bank connection request created successfully!")
 warning(
 "Please open the following URL in your browser to complete the authorization:"
@@ -60,6 +60,8 @@ def create_app() -> FastAPI:
 description="Open Banking API for Leggen",
 version=version,
 lifespan=lifespan,
+docs_url="/api/v1/docs",
+openapi_url="/api/v1/openapi.json",
 )

 # Add CORS middleware
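
The hunk above moves the interactive API docs under the versioned prefix. A minimal, self-contained sketch of the same FastAPI options; the app title below is illustrative and is not the project's actual factory:

```python
# Sketch only: FastAPI lets you relocate the generated docs and schema.
# The two paths mirror the hunk above; everything else is an assumption.
from fastapi import FastAPI

app = FastAPI(
    title="Example API",
    docs_url="/api/v1/docs",             # Swagger UI served here instead of /docs
    openapi_url="/api/v1/openapi.json",  # schema document the UI loads
)
```
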
@@ -1,93 +0,0 @@
-"""SQLModel database models for Leggen."""
-
-from datetime import datetime
-from typing import Optional
-
-from sqlmodel import JSON, Column, Field, SQLModel
-
-
-class Account(SQLModel, table=True):
-"""Account model."""
-
-__tablename__ = "accounts"
-
-id: str = Field(primary_key=True)
-institution_id: str = Field(index=True)
-status: str = Field(index=True)
-iban: Optional[str] = None
-name: Optional[str] = None
-currency: Optional[str] = None
-created: datetime
-last_accessed: Optional[datetime] = None
-last_updated: Optional[datetime] = None
-display_name: Optional[str] = None
-logo: Optional[str] = None
-
-
-class Balance(SQLModel, table=True):
-"""Balance model."""
-
-__tablename__ = "balances"
-
-id: Optional[int] = Field(default=None, primary_key=True)
-account_id: str = Field(index=True)
-bank: str
-status: str
-iban: str
-amount: float
-currency: str
-type: str
-timestamp: datetime = Field(index=True)
-
-
-class Transaction(SQLModel, table=True):
-"""Transaction model."""
-
-__tablename__ = "transactions"
-
-accountId: str = Field(primary_key=True)
-transactionId: str = Field(primary_key=True)
-internalTransactionId: Optional[str] = Field(default=None, index=True)
-institutionId: str
-iban: Optional[str] = None
-transactionDate: Optional[datetime] = Field(default=None, index=True)
-description: Optional[str] = None
-transactionValue: Optional[float] = Field(default=None, index=True)
-transactionCurrency: Optional[str] = None
-transactionStatus: Optional[str] = None
-rawTransaction: dict = Field(sa_column=Column(JSON))
-
-
-class TransactionEnrichment(SQLModel, table=True):
-"""Transaction enrichment model."""
-
-__tablename__ = "transaction_enrichments"
-
-accountId: str = Field(primary_key=True, foreign_key="transactions.accountId")
-transactionId: str = Field(
-primary_key=True, foreign_key="transactions.transactionId"
-)
-clean_name: Optional[str] = Field(default=None, index=True)
-category: Optional[str] = Field(default=None, index=True)
-logo_url: Optional[str] = None
-created_at: datetime
-updated_at: datetime
-
-
-class SyncOperation(SQLModel, table=True):
-"""Sync operation model."""
-
-__tablename__ = "sync_operations"
-
-id: Optional[int] = Field(default=None, primary_key=True)
-started_at: datetime = Field(index=True)
-completed_at: Optional[datetime] = None
-success: Optional[bool] = Field(default=None, index=True)
-accounts_processed: int = Field(default=0)
-transactions_added: int = Field(default=0)
-transactions_updated: int = Field(default=0)
-balances_updated: int = Field(default=0)
-duration_seconds: Optional[float] = None
-errors: Optional[str] = None
-logs: Optional[str] = None
-trigger_type: str = Field(default="manual", index=True)
@@ -1,65 +0,0 @@
-"""Database connection and session management using SQLModel."""
-
-from contextlib import contextmanager
-from typing import Generator
-
-from loguru import logger
-from sqlalchemy import create_engine
-from sqlalchemy.pool import StaticPool
-from sqlmodel import Session, SQLModel
-
-from leggen.utils.paths import path_manager
-
-_engine = None
-
-
-def get_database_url() -> str:
-"""Get the database URL for SQLAlchemy."""
-db_path = path_manager.get_database_path()
-return f"sqlite:///{db_path}"
-
-
-def get_engine():
-"""Get or create the database engine."""
-global _engine
-if _engine is None:
-db_url = get_database_url()
-_engine = create_engine(
-db_url,
-connect_args={"check_same_thread": False},
-poolclass=StaticPool,
-echo=False,
-)
-return _engine
-
-
-def create_db_and_tables():
-"""Create all database tables."""
-engine = get_engine()
-SQLModel.metadata.create_all(engine)
-logger.info("Database tables created/verified")
-
-
-@contextmanager
-def get_session() -> Generator[Session, None, None]:
-"""Get a database session context manager.
-
-Usage:
-with get_session() as session:
-result = session.exec(select(Account)).all()
-"""
-session = Session(get_engine())
-try:
-yield session
-session.commit()
-except Exception as e:
-session.rollback()
-logger.error(f"Database session error: {e}")
-raise
-finally:
-session.close()
-
-
-def init_database():
-"""Initialize the database with tables."""
-create_db_and_tables()
@@ -1,64 +0,0 @@
-"""Database helper utilities for Leggen - Compatibility layer."""
-
-import sqlite3
-from contextlib import contextmanager
-from pathlib import Path
-from typing import Any, Generator
-
-from loguru import logger
-
-
-@contextmanager
-def get_db_connection(db_path: Path) -> Generator[sqlite3.Connection, None, None]:
-"""Context manager for database connections.
-
-Usage:
-with get_db_connection(db_path) as conn:
-cursor = conn.cursor()
-cursor.execute(...)
-conn.commit()
-"""
-conn = None
-try:
-conn = sqlite3.connect(str(db_path))
-conn.row_factory = sqlite3.Row  # Enable dict-like access
-yield conn
-except Exception as e:
-if conn:
-conn.rollback()
-logger.error(f"Database error: {e}")
-raise
-finally:
-if conn:
-conn.close()
-
-
-def execute_query(
-db_path: Path, query: str, params: tuple = ()
-) -> list[dict[str, Any]]:
-"""Execute a SELECT query and return results as list of dicts."""
-with get_db_connection(db_path) as conn:
-cursor = conn.cursor()
-cursor.execute(query, params)
-rows = cursor.fetchall()
-return [dict(row) for row in rows]
-
-
-def execute_single(
-db_path: Path, query: str, params: tuple = ()
-) -> dict[str, Any] | None:
-"""Execute a SELECT query and return a single result as dict or None."""
-with get_db_connection(db_path) as conn:
-cursor = conn.cursor()
-cursor.execute(query, params)
-row = cursor.fetchone()
-return dict(row) if row else None
-
-
-def execute_count(db_path: Path, query: str, params: tuple = ()) -> int:
-"""Execute a COUNT query and return the integer result."""
-with get_db_connection(db_path) as conn:
-cursor = conn.cursor()
-cursor.execute(query, params)
-result = cursor.fetchone()
-return result[0] if result else 0
File diff suppressed because it is too large.
@@ -41,7 +41,9 @@ class GoCardlessService:
 headers = await self._get_auth_headers()

 async with httpx.AsyncClient() as client:
-response = await client.request(method, url, headers=headers, **kwargs)
+response = await client.request(
+method, url, headers=headers, timeout=30, **kwargs
+)
 _log_rate_limits(response, method, url)

 # If we get 401, clear token cache and retry once
@@ -49,7 +51,9 @@ class GoCardlessService:
 logger.warning("Got 401, clearing token cache and retrying")
 self._token = None
 headers = await self._get_auth_headers()
-response = await client.request(method, url, headers=headers, **kwargs)
+response = await client.request(
+method, url, headers=headers, timeout=30, **kwargs
+)
 _log_rate_limits(response, method, url)

 response.raise_for_status()
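
Both request sites above now pass a per-request `timeout=30` to httpx, so slow GoCardless endpoints fail after 30 seconds instead of hanging. A small standalone sketch of that pattern, with a hypothetical URL standing in for the real endpoint:

```python
# Minimal sketch of the timeout pattern; the URL is a placeholder, not a real API.
import asyncio

import httpx


async def fetch_with_timeout() -> int:
    async with httpx.AsyncClient() as client:
        # timeout=30 bounds connect + read for this single request
        response = await client.request("GET", "https://example.com/api", timeout=30)
        response.raise_for_status()
        return response.status_code


if __name__ == "__main__":
    print(asyncio.run(fetch_with_timeout()))
```
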
@@ -70,6 +70,9 @@ class TransactionProcessor:
 internal_transaction_id = transaction.get("internalTransactionId")

 if not transaction_id:
+if internal_transaction_id:
+transaction_id = internal_transaction_id
+else:
 raise ValueError("Transaction missing required transactionId field")

 return {
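
The added branch falls back to `internalTransactionId` when a bank omits `transactionId`. The same logic, extracted into a standalone sketch (the helper name is made up for illustration):

```python
# Sketch of the fallback above, assuming a plain dict payload from the bank API.
def resolve_transaction_id(transaction: dict) -> str:
    transaction_id = transaction.get("transactionId")
    internal_transaction_id = transaction.get("internalTransactionId")

    if not transaction_id:
        if internal_transaction_id:
            # Fall back to the provider-internal id so the row still gets a key
            transaction_id = internal_transaction_id
        else:
            raise ValueError("Transaction missing required transactionId field")

    return transaction_id


assert resolve_transaction_id({"internalTransactionId": "abc123"}) == "abc123"
```
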
leggen/utils/disk.py (new file, 35 lines)
@@ -0,0 +1,35 @@
+import json
+import sys
+from pathlib import Path
+
+import click
+
+from leggen.utils.text import error, info
+
+
+def save_file(name: str, d: dict):
+Path.mkdir(Path(click.get_app_dir("leggen")), exist_ok=True)
+config_file = click.get_app_dir("leggen") / Path(name)
+
+with click.open_file(str(config_file), "w") as f:
+json.dump(d, f)
+info(f"Wrote configuration file at '{config_file}'")
+
+
+def load_file(name: str) -> dict:
+config_file = click.get_app_dir("leggen") / Path(name)
+try:
+with click.open_file(str(config_file), "r") as f:
+config = json.load(f)
+return config
+except FileNotFoundError:
+error(f"Configuration file '{config_file}' not found")
+sys.exit(1)
+
+
+def get_prefixed_files(prefix: str) -> list:
+return [
+f.name
+for f in Path(click.get_app_dir("leggen")).iterdir()
+if f.name.startswith(prefix)
+]
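
A hypothetical usage sketch of the new `leggen/utils/disk.py` helpers; the requisition payload below is invented, and the files land in Click's per-user application directory for "leggen":

```python
# Illustrative only: round-trip a made-up requisition through the new helpers.
from leggen.utils.disk import get_prefixed_files, load_file, save_file

requisition = {"id": "example-req", "link": "https://example.com/authorize"}

save_file("req_example-req.json", requisition)      # writes to click.get_app_dir("leggen")
print(load_file("req_example-req.json"))            # {'id': 'example-req', ...}
print(get_prefixed_files("req_"))                   # ['req_example-req.json', ...]
```
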
@@ -1,6 +1,6 @@
 [project]
 name = "leggen"
-version = "2025.9.26"
+version = "2025.11.0"
 description = "An Open Banking CLI"
 authors = [{ name = "Elisiário Couto", email = "elisiario@couto.io" }]
 requires-python = "~=3.13.0"
@@ -36,8 +36,6 @@ dependencies = [
 "httpx>=0.28.1",
 "pydantic>=2.0.0,<3",
 "boto3>=1.35.0,<2",
-"sqlmodel>=0.0.25",
-"alembic>=1.16.5",
 ]

 [project.urls]
@@ -106,11 +106,6 @@ class TestConfigurablePaths:
 # Set custom database path
 path_manager.set_database_path(test_db_path)

-# Initialize database tables for the custom path
-from leggen.services.database import init_database
-
-init_database()
-
 # Test database operations using DatabaseService
 database_service = DatabaseService()
 balance_data = {
@@ -1,54 +1,16 @@
 """Tests for database service."""

-import tempfile
 from datetime import datetime
-from pathlib import Path
 from unittest.mock import patch

 import pytest

-from leggen.services.database import init_database
 from leggen.services.database_service import DatabaseService
-from leggen.utils.paths import path_manager


 @pytest.fixture
-def test_db_path():
-"""Create a temporary test database."""
-import os
-
-# Create a writable temporary file
-fd, temp_path = tempfile.mkstemp(suffix=".db")
-os.close(fd)  # Close the file descriptor
-db_path = Path(temp_path)
-
-# Set the test database path
-original_path = path_manager._database_path
-path_manager._database_path = db_path
-
-# Reset the engine to use the new database path
-import leggen.services.database as db_module
-
-original_engine = db_module._engine
-db_module._engine = None
-
-# Initialize database tables
-init_database()
-
-yield db_path
-
-# Cleanup - close any sessions first
-if db_module._engine:
-db_module._engine.dispose()
-db_module._engine = original_engine
-path_manager._database_path = original_path
-if db_path.exists():
-db_path.unlink()
-
-
-@pytest.fixture
-def database_service(test_db_path):
-"""Create a database service instance for testing with real database."""
+def database_service():
+"""Create a database service instance for testing."""
 return DatabaseService()

@@ -320,7 +282,6 @@ class TestDatabaseService:
 """Test successful balance persistence."""
 balance_data = {
 "institution_id": "REVOLUT_REVOLT21",
-"account_status": "active",
 "iban": "LT313250081177977789",
 "balances": [
 {
@@ -330,23 +291,26 @@ class TestDatabaseService:
 ],
 }

-# Test actual persistence
-await database_service._persist_balance_sqlite("test-account-123", balance_data)
-
-# Verify balance was persisted
-balances = await database_service.get_balances_from_db("test-account-123")
-assert len(balances) == 1
-assert balances[0]["account_id"] == "test-account-123"
-assert balances[0]["amount"] == 1000.0
-assert balances[0]["currency"] == "EUR"
+with patch("sqlite3.connect") as mock_connect:
+mock_conn = mock_connect.return_value
+mock_cursor = mock_conn.cursor.return_value
+
+await database_service._persist_balance_sqlite(
+"test-account-123", balance_data
+)
+
+# Verify database operations
+mock_connect.assert_called()
+mock_cursor.execute.assert_called()  # Table creation and insert
+mock_conn.commit.assert_called_once()
+mock_conn.close.assert_called_once()

 async def test_persist_balance_sqlite_error(self, database_service):
 """Test handling error during balance persistence."""
 balance_data = {"balances": []}

-# Mock get_session to raise an error
-with patch("leggen.services.database_service.get_session") as mock_session:
-mock_session.side_effect = Exception("Database error")
+with patch("sqlite3.connect") as mock_connect:
+mock_connect.side_effect = Exception("Database error")

 with pytest.raises(Exception, match="Database error"):
 await database_service._persist_balance_sqlite(
@@ -357,48 +321,52 @@ class TestDatabaseService:
 self, database_service, sample_transactions_db_format
 ):
 """Test successful transaction persistence."""
+with patch("sqlite3.connect") as mock_connect:
+mock_conn = mock_connect.return_value
+mock_cursor = mock_conn.cursor.return_value
+# Mock fetchone to return (0,) indicating transaction doesn't exist yet
+mock_cursor.fetchone.return_value = (0,)
+
 result = await database_service._persist_transactions_sqlite(
 "test-account-123", sample_transactions_db_format
 )

-# Should return all transactions as new
-assert len(result) == 2
+# Should return the transactions (assuming no duplicates)
+assert len(result) >= 0  # Could be empty if all are duplicates

-# Verify transactions were persisted
-transactions = await database_service.get_transactions_from_db(
-account_id="test-account-123"
-)
-assert len(transactions) == 2
-assert transactions[0]["accountId"] == "test-account-123"
+# Verify database operations
+mock_connect.assert_called()
+mock_cursor.execute.assert_called()
+mock_conn.commit.assert_called_once()
+mock_conn.close.assert_called_once()

 async def test_persist_transactions_sqlite_duplicate_detection(
 self, database_service, sample_transactions_db_format
 ):
 """Test that existing transactions are not returned as new."""
-# First insert
-result1 = await database_service._persist_transactions_sqlite(
-"test-account-123", sample_transactions_db_format
-)
-assert len(result1) == 2
+with patch("sqlite3.connect") as mock_connect:
+mock_conn = mock_connect.return_value
+mock_cursor = mock_conn.cursor.return_value
+# Mock fetchone to return (1,) indicating transaction already exists
+mock_cursor.fetchone.return_value = (1,)

-# Second insert (duplicates)
-result2 = await database_service._persist_transactions_sqlite(
+result = await database_service._persist_transactions_sqlite(
 "test-account-123", sample_transactions_db_format
 )

 # Should return empty list since all transactions already exist
-assert len(result2) == 0
+assert len(result) == 0

-# Verify still only 2 transactions in database
-transactions = await database_service.get_transactions_from_db(
-account_id="test-account-123"
-)
-assert len(transactions) == 2
+# Verify database operations still happened (INSERT OR REPLACE executed)
+mock_connect.assert_called()
+mock_cursor.execute.assert_called()
+mock_conn.commit.assert_called_once()
+mock_conn.close.assert_called_once()

 async def test_persist_transactions_sqlite_error(self, database_service):
 """Test handling error during transaction persistence."""
-with patch("leggen.services.database_service.get_session") as mock_session:
-mock_session.side_effect = Exception("Database error")
+with patch("sqlite3.connect") as mock_connect:
+mock_connect.side_effect = Exception("Database error")

 with pytest.raises(Exception, match="Database error"):
 await database_service._persist_transactions_sqlite(
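
The reworked tests above patch `sqlite3.connect` and assert against the mock connection instead of writing to a temporary database. A minimal pytest-style sketch of that pattern, with a made-up function under test:

```python
# Sketch of the mocking pattern used in the tests above; insert_row is invented.
import sqlite3
from unittest.mock import patch


def insert_row(db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    conn.cursor().execute("INSERT INTO t VALUES (1)")
    conn.commit()
    conn.close()


def test_insert_row_commits():
    with patch("sqlite3.connect") as mock_connect:
        mock_conn = mock_connect.return_value

        insert_row("ignored.db")  # never touches disk, connect is mocked

        mock_conn.cursor.return_value.execute.assert_called_once()
        mock_conn.commit.assert_called_once()
        mock_conn.close.assert_called_once()
```
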
uv.lock (generated file, 113 changed lines)
@@ -2,20 +2,6 @@ version = 1
 revision = 3
 requires-python = "==3.13.*"

-[[package]]
-name = "alembic"
-version = "1.16.5"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-{ name = "mako" },
-{ name = "sqlalchemy" },
-{ name = "typing-extensions" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/9a/ca/4dc52902cf3491892d464f5265a81e9dff094692c8a049a3ed6a05fe7ee8/alembic-1.16.5.tar.gz", hash = "sha256:a88bb7f6e513bd4301ecf4c7f2206fe93f9913f9b48dac3b78babde2d6fe765e", size = 1969868, upload-time = "2025-08-27T18:02:05.668Z" }
-wheels = [
-{ url = "https://files.pythonhosted.org/packages/39/4a/4c61d4c84cfd9befb6fa08a702535b27b21fff08c946bc2f6139decbf7f7/alembic-1.16.5-py3-none-any.whl", hash = "sha256:e845dfe090c5ffa7b92593ae6687c5cb1a101e91fa53868497dbd79847f9dbe3", size = 247355, upload-time = "2025-08-27T18:02:07.37Z" },
-]
-
 [[package]]
 name = "annotated-types"
 version = "0.7.0"
@@ -181,23 +167,6 @@ wheels = [
 { url = "https://files.pythonhosted.org/packages/42/14/42b2651a2f46b022ccd948bca9f2d5af0fd8929c4eec235b8d6d844fbe67/filelock-3.19.1-py3-none-any.whl", hash = "sha256:d38e30481def20772f5baf097c122c3babc4fcdb7e14e57049eb9d88c6dc017d", size = 15988, upload-time = "2025-08-14T16:56:01.633Z" },
 ]

-[[package]]
-name = "greenlet"
-version = "3.2.4"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/03/b8/704d753a5a45507a7aab61f18db9509302ed3d0a27ac7e0359ec2905b1a6/greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d", size = 188260, upload-time = "2025-08-07T13:24:33.51Z" }
-wheels = [
-{ url = "https://files.pythonhosted.org/packages/49/e8/58c7f85958bda41dafea50497cbd59738c5c43dbbea5ee83d651234398f4/greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31", size = 272814, upload-time = "2025-08-07T13:15:50.011Z" },
-{ url = "https://files.pythonhosted.org/packages/62/dd/b9f59862e9e257a16e4e610480cfffd29e3fae018a68c2332090b53aac3d/greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945", size = 641073, upload-time = "2025-08-07T13:42:57.23Z" },
-{ url = "https://files.pythonhosted.org/packages/f7/0b/bc13f787394920b23073ca3b6c4a7a21396301ed75a655bcb47196b50e6e/greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc", size = 655191, upload-time = "2025-08-07T13:45:29.752Z" },
-{ url = "https://files.pythonhosted.org/packages/f2/d6/6adde57d1345a8d0f14d31e4ab9c23cfe8e2cd39c3baf7674b4b0338d266/greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a", size = 649516, upload-time = "2025-08-07T13:53:16.314Z" },
-{ url = "https://files.pythonhosted.org/packages/7f/3b/3a3328a788d4a473889a2d403199932be55b1b0060f4ddd96ee7cdfcad10/greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504", size = 652169, upload-time = "2025-08-07T13:18:32.861Z" },
-{ url = "https://files.pythonhosted.org/packages/ee/43/3cecdc0349359e1a527cbf2e3e28e5f8f06d3343aaf82ca13437a9aa290f/greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671", size = 610497, upload-time = "2025-08-07T13:18:31.636Z" },
-{ url = "https://files.pythonhosted.org/packages/b8/19/06b6cf5d604e2c382a6f31cafafd6f33d5dea706f4db7bdab184bad2b21d/greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b", size = 1121662, upload-time = "2025-08-07T13:42:41.117Z" },
-{ url = "https://files.pythonhosted.org/packages/a2/15/0d5e4e1a66fab130d98168fe984c509249c833c1a3c16806b90f253ce7b9/greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae", size = 1149210, upload-time = "2025-08-07T13:18:24.072Z" },
-{ url = "https://files.pythonhosted.org/packages/0b/55/2321e43595e6801e105fcfdee02b34c0f996eb71e6ddffca6b10b7e1d771/greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b", size = 299685, upload-time = "2025-08-07T13:24:38.824Z" },
-]
-
 [[package]]
 name = "h11"
 version = "0.16.0"
@@ -288,10 +257,9 @@ wheels = [

 [[package]]
 name = "leggen"
-version = "2025.9.26"
+version = "2025.11.0"
 source = { editable = "." }
 dependencies = [
-{ name = "alembic" },
 { name = "apscheduler" },
 { name = "boto3" },
 { name = "click" },
@@ -301,7 +269,6 @@ dependencies = [
 { name = "loguru" },
 { name = "pydantic" },
 { name = "requests" },
-{ name = "sqlmodel" },
 { name = "tabulate" },
 { name = "tomli-w" },
 { name = "uvicorn", extra = ["standard"] },
@@ -323,7 +290,6 @@ dev = [

 [package.metadata]
 requires-dist = [
-{ name = "alembic", specifier = ">=1.16.5" },
 { name = "apscheduler", specifier = ">=3.10.0,<4" },
 { name = "boto3", specifier = ">=1.35.0,<2" },
 { name = "click", specifier = ">=8.1.7,<9" },
@@ -333,7 +299,6 @@ requires-dist = [
 { name = "loguru", specifier = ">=0.7.2,<0.8" },
 { name = "pydantic", specifier = ">=2.0.0,<3" },
 { name = "requests", specifier = ">=2.31.0,<3" },
-{ name = "sqlmodel", specifier = ">=0.0.25" },
 { name = "tabulate", specifier = ">=0.9.0,<0.10" },
 { name = "tomli-w", specifier = ">=1.0.0,<2" },
 { name = "uvicorn", extras = ["standard"], specifier = ">=0.24.0,<1" },
@@ -366,48 +331,6 @@ wheels = [
|
|||||||
{ url = "https://files.pythonhosted.org/packages/0c/29/0348de65b8cc732daa3e33e67806420b2ae89bdce2b04af740289c5c6c8c/loguru-0.7.3-py3-none-any.whl", hash = "sha256:31a33c10c8e1e10422bfd431aeb5d351c7cf7fa671e3c4df004162264b28220c", size = 61595, upload-time = "2024-12-06T11:20:54.538Z" },
|
{ url = "https://files.pythonhosted.org/packages/0c/29/0348de65b8cc732daa3e33e67806420b2ae89bdce2b04af740289c5c6c8c/loguru-0.7.3-py3-none-any.whl", hash = "sha256:31a33c10c8e1e10422bfd431aeb5d351c7cf7fa671e3c4df004162264b28220c", size = 61595, upload-time = "2024-12-06T11:20:54.538Z" },
|
||||||
]
|
]
|
||||||
|
|
||||||
[[package]]
|
|
||||||
name = "mako"
|
|
||||||
version = "1.3.10"
|
|
||||||
source = { registry = "https://pypi.org/simple" }
|
|
||||||
dependencies = [
|
|
||||||
{ name = "markupsafe" },
|
|
||||||
]
|
|
||||||
sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" }
|
|
||||||
wheels = [
|
|
||||||
{ url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" },
|
|
||||||
]
|
|
||||||
|
|
||||||
[[package]]
name = "markupsafe"
version = "3.0.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" },
    { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" },
    { url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" },
    { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" },
    { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" },
    { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" },
    { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" },
    { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" },
    { url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" },
    { url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" },
    { url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" },
    { url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" },
    { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" },
    { url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" },
    { url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" },
    { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" },
    { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" },
    { url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" },
    { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" },
    { url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" },
    { url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" },
    { url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" },
]

[[package]]
name = "mypy"
version = "1.17.1"
@@ -723,40 +646,6 @@ wheels = [
    { url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
]

[[package]]
name = "sqlalchemy"
version = "2.0.43"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "greenlet", marker = "platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'" },
    { name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d7/bc/d59b5d97d27229b0e009bd9098cd81af71c2fa5549c580a0a67b9bed0496/sqlalchemy-2.0.43.tar.gz", hash = "sha256:788bfcef6787a7764169cfe9859fe425bf44559619e1d9f56f5bddf2ebf6f417", size = 9762949, upload-time = "2025-08-11T14:24:58.438Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/41/1c/a7260bd47a6fae7e03768bf66451437b36451143f36b285522b865987ced/sqlalchemy-2.0.43-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e7c08f57f75a2bb62d7ee80a89686a5e5669f199235c6d1dac75cd59374091c3", size = 2130598, upload-time = "2025-08-11T15:51:15.903Z" },
    { url = "https://files.pythonhosted.org/packages/8e/84/8a337454e82388283830b3586ad7847aa9c76fdd4f1df09cdd1f94591873/sqlalchemy-2.0.43-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:14111d22c29efad445cd5021a70a8b42f7d9152d8ba7f73304c4d82460946aaa", size = 2118415, upload-time = "2025-08-11T15:51:17.256Z" },
    { url = "https://files.pythonhosted.org/packages/cf/ff/22ab2328148492c4d71899d62a0e65370ea66c877aea017a244a35733685/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:21b27b56eb2f82653168cefe6cb8e970cdaf4f3a6cb2c5e3c3c1cf3158968ff9", size = 3248707, upload-time = "2025-08-11T15:52:38.444Z" },
    { url = "https://files.pythonhosted.org/packages/dc/29/11ae2c2b981de60187f7cbc84277d9d21f101093d1b2e945c63774477aba/sqlalchemy-2.0.43-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c5a9da957c56e43d72126a3f5845603da00e0293720b03bde0aacffcf2dc04f", size = 3253602, upload-time = "2025-08-11T15:56:37.348Z" },
    { url = "https://files.pythonhosted.org/packages/b8/61/987b6c23b12c56d2be451bc70900f67dd7d989d52b1ee64f239cf19aec69/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d79f9fdc9584ec83d1b3c75e9f4595c49017f5594fee1a2217117647225d738", size = 3183248, upload-time = "2025-08-11T15:52:39.865Z" },
    { url = "https://files.pythonhosted.org/packages/86/85/29d216002d4593c2ce1c0ec2cec46dda77bfbcd221e24caa6e85eff53d89/sqlalchemy-2.0.43-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9df7126fd9db49e3a5a3999442cc67e9ee8971f3cb9644250107d7296cb2a164", size = 3219363, upload-time = "2025-08-11T15:56:39.11Z" },
    { url = "https://files.pythonhosted.org/packages/b6/e4/bd78b01919c524f190b4905d47e7630bf4130b9f48fd971ae1c6225b6f6a/sqlalchemy-2.0.43-cp313-cp313-win32.whl", hash = "sha256:7f1ac7828857fcedb0361b48b9ac4821469f7694089d15550bbcf9ab22564a1d", size = 2096718, upload-time = "2025-08-11T15:55:05.349Z" },
    { url = "https://files.pythonhosted.org/packages/ac/a5/ca2f07a2a201f9497de1928f787926613db6307992fe5cda97624eb07c2f/sqlalchemy-2.0.43-cp313-cp313-win_amd64.whl", hash = "sha256:971ba928fcde01869361f504fcff3b7143b47d30de188b11c6357c0505824197", size = 2123200, upload-time = "2025-08-11T15:55:07.932Z" },
    { url = "https://files.pythonhosted.org/packages/b8/d9/13bdde6521f322861fab67473cec4b1cc8999f3871953531cf61945fad92/sqlalchemy-2.0.43-py3-none-any.whl", hash = "sha256:1681c21dd2ccee222c2fe0bef671d1aef7c504087c9c4e800371cfcc8ac966fc", size = 1924759, upload-time = "2025-08-11T15:39:53.024Z" },
]

[[package]]
name = "sqlmodel"
version = "0.0.25"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pydantic" },
    { name = "sqlalchemy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ea/80/d9c098a88724ee4554907939cf39590cf67e10c6683723216e228d3315f7/sqlmodel-0.0.25.tar.gz", hash = "sha256:56548c2e645975b1ed94d6c53f0d13c85593f57926a575e2bf566650b2243fa4", size = 117075, upload-time = "2025-09-17T21:44:41.219Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/57/cf/5d175ce8de07fe694ec4e3d4d65c2dd06cc30f6c79599b31f9d2f6dd2830/sqlmodel-0.0.25-py3-none-any.whl", hash = "sha256:c98234cda701fb77e9dcbd81688c23bb251c13bb98ce1dd8d4adc467374d45b7", size = 28893, upload-time = "2025-09-17T21:44:39.764Z" },
]

[[package]]
name = "starlette"
version = "0.47.3"