AlphaDash is a full-stack web application designed to help users track their financial asset portfolios, visualize historical performance, and explore basic trading signals. It demonstrates a range of modern software engineering practices and technologies.
- User Authentication (Registration, Login with JWT)
- Portfolio Management:
  - Add/Edit/Delete asset holdings (quantity, purchase price, date)
  - View portfolio summary with real-time(ish) valuation
  - Calculated current values and gains/losses per holding and in total
- Asset Management:
  - System for defining global assets (stocks, cryptocurrencies)
  - Users can create new assets if they are not already in the system
- Financial Data Integration:
  - Fetches current prices and historical data from external financial APIs (primarily Yahoo Finance, with Alpha Vantage as an optional fallback)
  - Caching mechanism for external API responses (Redis)
- Data Visualization:
  - Historical price charts for individual assets
  - Display of Simple Moving Averages (SMA 20, SMA 50) on charts
- Asynchronous Task Processing:
  - Background tasks (Celery with Redis broker) for periodic data refresh (all asset prices)
- Containerized Deployment:
  - Fully containerized using Docker and Docker Compose for easy setup and consistent environments
- Automated Testing & CI:
  - Comprehensive unit tests for the backend (Pytest)
  - Basic CI pipeline using GitHub Actions (linting, formatting, testing, build checks)
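The valuation arithmetic behind the portfolio summary can be sketched as follows. This is a minimal illustration, not the project's actual code; `holding_value` and `holding_gain` are hypothetical names, and `Decimal` is used because binary floats are a poor fit for money:

```python
from decimal import Decimal

def holding_value(quantity: Decimal, current_price: Decimal) -> Decimal:
    """Current market value of a single holding."""
    return quantity * current_price

def holding_gain(quantity: Decimal, purchase_price: Decimal, current_price: Decimal) -> Decimal:
    """Unrealized gain/loss for the lot: quantity * (current - purchase)."""
    return quantity * (current_price - purchase_price)

# Example: 10 shares bought at 150.00, now trading at 172.50
qty, buy, now = Decimal("10"), Decimal("150.00"), Decimal("172.50")
print(holding_value(qty, now))       # 1725.00
print(holding_gain(qty, buy, now))   # 225.00
```

The total portfolio figures are then just sums of these per-holding values.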
- Language: Python 3.13
- Framework: FastAPI
- Authentication: JWT (python-jose), Passlib (for password hashing)
- Database: PostgreSQL
- ORM: SQLAlchemy with Alembic for migrations
- Data Validation: Pydantic
- Asynchronous Tasks: Celery, Redis (as broker and result backend)
- Caching: Redis (for application-level shared cache)
- External APIs: Yahoo Finance (via `yfinance`), Alpha Vantage (as fallback)
- Linters/Formatters: Ruff, Black
- Language: TypeScript
- Framework/Library: React
- State Management: React Hooks (`useState`, `useEffect`)
- Routing: React Router DOM
- API Client: Axios
- Charting: Chart.js with react-chartjs-2, date-fns adapter
- Styling: Basic CSS/Inline Styles
- Containerization: Docker, Docker Compose
- CI/CD: GitHub Actions (Linting, Formatting, Testing, Docker Build)
- Version Control: Git, GitHub
- Development Methodology: Issue-driven development (using GitHub Issues and PRs for feature tracking and code review)
The application follows a modern full-stack architecture:
- Frontend: A React application responsible for user interaction and presentation.
- Backend: A Python FastAPI application serving a RESTful API for all business logic, data processing, and external API interactions.
- Database: PostgreSQL stores persistent data like user information, asset definitions, and portfolio holdings.
- Caching Layer: Redis is used for caching frequently accessed data (e.g., asset prices from external APIs) to improve performance and reduce external API load.
- Task Queue: Celery with Redis as a broker handles asynchronous background tasks, such as periodic refreshment of asset prices.
- Containerization: All services (frontend, backend, database, Redis, Celery workers/beat) are containerized with Docker and orchestrated using Docker Compose for local development and ensuring a consistent environment.
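The cache-aside pattern described for the caching layer can be sketched as follows. This is a hedged illustration, not the project's code: `FakeCache` is an in-memory stand-in mimicking a Redis client's `get`/`setex`, and `get_price_cached` is a hypothetical helper name:

```python
import time
from typing import Callable, Optional

class FakeCache:
    """In-memory stand-in for a Redis client (get/setex with TTL)."""
    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def get(self, key: str) -> Optional[str]:
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # entry expired: treat as a miss
            return None
        return value

    def setex(self, key: str, ttl_seconds: int, value: str) -> None:
        self._store[key] = (value, time.monotonic() + ttl_seconds)

def get_price_cached(cache, symbol: str, fetch: Callable[[str], str], ttl: int = 300) -> str:
    """Cache-aside: return the cached price if present, else fetch and cache it."""
    key = f"price:{symbol}"
    cached = cache.get(key)
    if cached is not None:
        return cached
    price = fetch(symbol)        # in the real app: a call to an external API
    cache.setex(key, ttl, price)
    return price
```

Within the TTL, repeated lookups for the same symbol hit the cache and never reach the external API, which is exactly how rate limits are kept in check.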
Simple text diagram:

```
[ User (Browser) ] <--> [ React Frontend (localhost:3000) ]
                                  ^
                                  | (HTTP API Calls)
                                  v
[ FastAPI Backend (localhost:8000) ] <-----> [ PostgreSQL DB ]
        ^        ^       |
        |        |       +----> [ Redis (Cache & Celery Broker) ]
        |        |
        |        +------------> [ Celery Workers & Beat (Background Tasks) ]
        |
        +---------------------> [ External Financial APIs (Yahoo Finance, Alpha Vantage) ]
```
- Docker Desktop (or Docker Engine + Docker Compose) installed and running.
- Git installed.
- (For direct backend/frontend development without the full Docker Compose build) Python 3.13 (matching the backend) and the latest Node.js LTS version installed.
- Clone the repository:

  ```bash
  git clone https://github.com/zhu-weijie/alpha-dash.git
  cd alpha-dash
  ```
- Environment Configuration:
  - Root `.env` for Docker Compose: Create a `.env` file in the project root (same directory as `docker-compose.yml`). This file is used by Docker Compose to inject environment variables into services. Refer to `.env.example` for required variables. Example content for `alpha-dash/.env`:

    ```env
    SECRET_KEY="a_very_strong_random_secret_key_for_jwt_at_least_32_chars"
    ALGORITHM="HS256"
    ALPHA_VANTAGE_API_KEY="YOUR_OPTIONAL_AV_KEY"
    # Other variables like PROJECT_NAME, ACCESS_TOKEN_EXPIRE_MINUTES if you want to override defaults
    ```
  - Backend `.env` for direct Python execution (also the source for Pydantic settings when CI env vars are not set): Navigate to the `backend/` directory, copy `backend/.env.example` to `backend/.env`, and fill in the required values (especially if you plan to run the backend outside Docker for some tests, or if your Celery tasks/Alembic need it when run locally):

    ```bash
    cd backend
    cp .env.example .env
    # Edit backend/.env with your local settings (e.g., DATABASE_URL if running the DB outside Docker).
    # For Docker Compose, the DATABASE_URL inside the container is set by docker-compose.yml.
    cd ..
    ```
- Build and Run with Docker Compose: From the project root directory (where `docker-compose.yml` is located):

  ```bash
  docker compose up --build -d
  ```

  - `--build`: Builds the Docker images for the backend and frontend if they don't exist or if the Dockerfiles have changed.
  - `-d`: Runs the containers in detached mode (in the background).
- Apply Database Migrations (first run, or after a DB reset): The backend service does not automatically run migrations on startup; you need to run them manually against the running database container. In a new terminal:

  ```bash
  # Navigate to the project root
  cd alpha-dash

  # Execute the alembic upgrade command inside the backend container
  docker compose exec backend alembic upgrade head
  ```

  Alternatively, run Alembic locally from the `backend/` directory (ensure your virtualenv is active and that `backend/.env` points to the Dockerized PostgreSQL, e.g. `DATABASE_URL=postgresql://alphadash_user:alphadash_pass@localhost:5432/alphadash_db`):

  ```bash
  cd backend
  python3.13 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  alembic upgrade head
  ```
- Accessing the Application:
  - Frontend: Open your browser and navigate to `http://localhost:3000`
  - Backend API Docs (Swagger UI): `http://localhost:8000/docs`
  - Backend API Docs (ReDoc): `http://localhost:8000/redoc`
- Stopping the Application: From the project root directory:

  ```bash
  docker compose down
  ```

  To also remove volumes (database data, Redis data, Celery beat schedule):

  ```bash
  docker compose down --volumes
  ```
The backend provides a RESTful API. Key endpoint groups include:

- `/api/v1/auth/token`: User login and JWT generation.
- `/api/v1/users/`: User registration and fetching the current user (`/me`).
- `/api/v1/assets/`: CRUD operations for global asset definitions.
- `/api/v1/portfolio/holdings/`: CRUD operations for user-specific portfolio holdings.
- `/api/v1/market-data/{symbol}/`: Fetching current price and historical data for assets.
For detailed API documentation, please run the application and visit `http://localhost:8000/docs`.
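As an illustration of the typical client flow (log in for a token, then call a protected endpoint with a Bearer header), the requests can be sketched with Python's standard library. The request objects are only constructed here, not sent, and the credentials and token are placeholders:

```python
import urllib.parse
import urllib.request

BASE = "http://localhost:8000"

# OAuth2 password flow: the token endpoint expects form-encoded credentials.
login_body = urllib.parse.urlencode(
    {"username": "[email protected]", "password": "secret"}  # placeholder credentials
).encode()
login_req = urllib.request.Request(
    f"{BASE}/api/v1/auth/token",
    data=login_body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)

# With the JWT returned by the login call, hit a protected endpoint.
token = "<jwt-from-login-response>"  # placeholder
holdings_req = urllib.request.Request(
    f"{BASE}/api/v1/portfolio/holdings/",
    headers={"Authorization": f"Bearer {token}"},
)

# urllib.request.urlopen(login_req) would actually perform the call.
```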
Here is a screenshot of the API docs:
- Full-Stack Development: Building both backend (Python/FastAPI) and frontend (React/TypeScript) components.
- API Design & Development: Creating RESTful APIs with FastAPI, including request/response validation (Pydantic), authentication (JWT), and clear endpoint structuring.
- Database Management:
- Working with PostgreSQL.
- Using SQLAlchemy for ORM and database interaction.
- Managing database schema migrations with Alembic.
- Authentication & Authorization: Implementing secure user registration, password hashing (Passlib/bcrypt), and token-based authentication (JWT). Protecting API endpoints.
- Asynchronous Task Processing: Utilizing Celery and Redis for background tasks (periodic data refresh), demonstrating understanding of distributed systems concepts.
- Caching Strategies: Implementing caching (Redis) for external API responses to improve performance and manage rate limits.
- External API Integration: Fetching and processing data from third-party financial APIs (Yahoo Finance, Alpha Vantage).
- Containerization: Dockerizing all application services (backend, frontend, database, Redis, Celery) and orchestrating them with Docker Compose for development and consistent environments.
- Testing:
  - Writing unit tests for backend logic (CRUD, services, utilities) using Pytest and `unittest.mock`.
  - Implementing basic API endpoint tests using FastAPI's `TestClient`.
- DevOps & CI/CD: Setting up a basic Continuous Integration pipeline with GitHub Actions for automated linting, formatting, testing, and build checks.
- Modern Python & TypeScript: Utilizing modern language features, type hinting, and asynchronous programming (FastAPI).
- Software Design Patterns: service layer, data provider pattern, and dependency injection (via FastAPI), applied implicitly throughout the codebase.
- Problem Solving: Addressing challenges like API rate limits, data serialization, and inter-service communication within a Dockerized environment.
- Project Structure & Maintainability: Organizing code into logical modules and packages for better maintainability and scalability.
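On the token-based authentication mentioned above: the project uses python-jose, but the HS256 signing it performs can be illustrated with the standard library alone. This is a didactic sketch, not the project's code:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url without padding, as the JWT spec requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt_hs256(claims: dict, secret: str) -> str:
    """Build header.payload.signature, signing with HMAC-SHA256."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(claims, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret.encode(), signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

token = sign_jwt_hs256({"sub": "[email protected]", "exp": 1700000000}, "secret-key")
print(token.count("."))  # 2: header.payload.signature
```

Verification is the mirror image: recompute the HMAC over `header.payload` and compare it (in constant time) with the third segment, which is what a library like python-jose does under the hood.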
- Advanced "Alpha" Signals: Implement more complex technical indicators or allow users to define simple signal conditions.
- Real-time Price Updates: Integrate WebSockets for real-time price updates on the portfolio page.
- User Roles & Permissions: Differentiate between regular users and admin users (e.g., for managing global assets).
- More Comprehensive Testing: Add frontend unit/integration tests, expand backend integration test coverage.
- Deployment: Script deployment to a cloud platform (e.g., GCP Cloud Run, AWS ECS/EKS).
- Enhanced UI/UX: Utilize a component library for a more polished UI, and improve form validation and user feedback.
- Scalability Improvements for Data Fetching: Implement more advanced batching or a dedicated data ingestion pipeline for financial data if scaling to many users/assets.
- OAuth2 Scopes: Implement more granular permissions using OAuth2 scopes.
- Portfolio Overview Page
- Asset Chart Page
- Add New Holding Page
- Edit Holding Page
- Add new asset page
- Redis Cache Logs
- Celery Worker Logs
- Celery Beat Logs
To better understand and illustrate the interconnected nature of the data within Alpha-Dash, I've modeled a small, representative subset of its core entities and relationships using Neo4j, a native graph database. This exploration helps visualize how users, their portfolio holdings, and specific assets are linked, which is foundational to knowledge graph concepts and can unlock deeper insights.
Conceptual Graph Visualization:
Core Entities & Relationships Modeled:
- Nodes (Entities):
  - `:User` (Properties: `userId`, `email`, `name`)
  - `:Asset` (Properties: `assetId`, `symbol`, `name`, `type`)
  - `:PortfolioHolding` (Properties: `holdingId`, `quantity`, `purchase_price`, `purchase_date`)
- Relationships (Edges):
  - `(User)-[:OWNS]->(PortfolioHolding)`: Represents a user owning a specific lot/holding.
  - `(PortfolioHolding)-[:IS_FOR_ASSET]->(Asset)`: Links a holding to the specific asset it represents.
Sample Cypher Script for Population:
The following Cypher script was used to create the sample graph in Neo4j:
```cypher
// neo4j_exploration/populate_sample_graph.cypher

// Optional: Clear any existing data from previous runs
MATCH (n) DETACH DELETE n;

// --- Create Nodes ---

// User
CREATE (user1:User {userId: 1, email: '[email protected]', name: 'Zhu Weijie'});

// Assets
CREATE (asset1:Asset {assetId: 101, symbol: 'AAPL', name: 'Apple Inc.', type: 'stock'});
CREATE (asset2:Asset {assetId: 102, symbol: 'BTC', name: 'Bitcoin', type: 'crypto'});
CREATE (asset3:Asset {assetId: 103, symbol: 'MSFT', name: 'Microsoft Corp.', type: 'stock'});

// Portfolio Holdings for User1
CREATE (holding1:PortfolioHolding {holdingId: 201, quantity: 10, purchase_price: 150.00, purchase_date: date("2023-01-10")});
CREATE (holding2:PortfolioHolding {holdingId: 202, quantity: 0.5, purchase_price: 30000.00, purchase_date: date("2023-02-15")});
CREATE (holding3:PortfolioHolding {holdingId: 203, quantity: 5, purchase_price: 160.00, purchase_date: date("2023-03-20")});

// --- Create Relationships ---

// User1 OWNS Holding1 (AAPL)
MATCH (u:User {userId: 1}), (h:PortfolioHolding {holdingId: 201}) CREATE (u)-[:OWNS]->(h);
// User1 OWNS Holding2 (BTC)
MATCH (u:User {userId: 1}), (h:PortfolioHolding {holdingId: 202}) CREATE (u)-[:OWNS]->(h);
// User1 OWNS Holding3 (AAPL, second lot)
MATCH (u:User {userId: 1}), (h:PortfolioHolding {holdingId: 203}) CREATE (u)-[:OWNS]->(h);

// Holding1 IS_FOR_ASSET Asset1 (AAPL)
MATCH (h:PortfolioHolding {holdingId: 201}), (a:Asset {assetId: 101}) CREATE (h)-[:IS_FOR_ASSET]->(a);
// Holding2 IS_FOR_ASSET Asset2 (BTC)
MATCH (h:PortfolioHolding {holdingId: 202}), (a:Asset {assetId: 102}) CREATE (h)-[:IS_FOR_ASSET]->(a);
// Holding3 IS_FOR_ASSET Asset1 (AAPL)
MATCH (h:PortfolioHolding {holdingId: 203}), (a:Asset {assetId: 101}) CREATE (h)-[:IS_FOR_ASSET]->(a);

CALL db.awaitIndexes();
RETURN "Sample Alpha-Dash graph created successfully!" AS status;
```
```mermaid
erDiagram
    users {
        int id PK "Primary Key"
        varchar email UK "Unique Key"
        varchar hashed_password
        boolean is_active
        datetime created_at
        datetime updated_at
    }
    assets {
        int id PK "Primary Key"
        varchar symbol UK "Unique Key, e.g., AAPL, BTC"
        varchar name "e.g., Apple Inc."
        varchar asset_type "Enum: 'stock', 'crypto'"
        datetime created_at
        datetime updated_at
        datetime last_price_updated_at "Nullable, updated by Celery task"
    }
    portfolio_holdings {
        int id PK "Primary Key"
        int user_id FK "Foreign Key to users.id"
        int asset_id FK "Foreign Key to assets.id"
        float quantity "Refine: Use DECIMAL(19, 8) for precision"
        float purchase_price "Refine: Use DECIMAL(19, 4) for precision"
        datetime purchase_date
        datetime created_at
        datetime updated_at
    }
    users ||--o{ portfolio_holdings : owns
    assets ||--o{ portfolio_holdings : "is held in"
```
```mermaid
classDiagram
    direction TD
    class AssetDetailAPI {
        <<API Endpoint>>
        +GET /signals/:symbol?short=20&long=50
    }
    note for AssetDetailAPI "Signal windows are now configurable via query params."
    class SignalService {
        <<Service>>
        +get_sma_crossover_signal(symbol, asset_type, short_window, long_window)
    }
    note for SignalService "New service for all analytical/signal calculations.\nKeeps API layer clean."
    class FinancialDataOrchestrator {
        <<Service>>
        +get_historical_data(symbol, asset_type, outputsize)
    }
    class BaseDataProvider {
        <<Abstract>>
        +fetch_historical_data()
    }
    note for BaseDataProvider "Abstract Base Class defines a contract for all data providers."
    class YahooFinanceProvider {
        <<DataProvider>>
        +fetch_yf_historical_data()
    }
    class AlphaVantageProvider {
        <<DataProvider>>
        +fetch_av_stock_historical_data()
        +fetch_av_crypto_historical_data()
    }
    class SignalResponse {
        <<Schema>>
        List historical_data
        List signals
    }
    class SignalPoint {
        <<Schema>>
        date date
        string signal_type
        Decimal price
    }
    note for SignalPoint "Using Decimal type for financial precision."
    AssetDetailAPI --> SignalService : uses
    SignalService --> FinancialDataOrchestrator : uses
    FinancialDataOrchestrator --> YahooFinanceProvider : "uses (primary)"
    FinancialDataOrchestrator --> AlphaVantageProvider : "uses (fallback)"
    BaseDataProvider <|-- YahooFinanceProvider : implements
    BaseDataProvider <|-- AlphaVantageProvider : implements
    AssetDetailAPI ..> SignalResponse : "returns"
    SignalService ..> SignalResponse : "returns"
```
```mermaid
sequenceDiagram
    actor User
    participant Browser
    participant AssetDetailPage as React Component
    participant BackendAPI as "/api/v1/signals/{symbol}"
    participant SignalService
    participant DataOrchestrator
    User->>Browser: Clicks on asset "AAPL" in portfolio
    Browser->>AssetDetailPage: Renders page for "AAPL"
    activate AssetDetailPage
    AssetDetailPage->>BackendAPI: GET /api/v1/signals/AAPL
    deactivate AssetDetailPage
    activate BackendAPI
    BackendAPI->>SignalService: get_sma_crossover_signal("AAPL", "stock")
    activate SignalService
    SignalService->>DataOrchestrator: get_historical_data("AAPL", "stock", "full")
    activate DataOrchestrator
    Note right of DataOrchestrator: Checks cache first.<br/>On miss, calls provider (e.g., yfinance).
    DataOrchestrator-->>SignalService: Returns List[HistoricalPricePoint]
    deactivate DataOrchestrator
    Note right of SignalService: 1. Calculates 20-day SMA.<br/>2. Calculates 50-day SMA.<br/>3. Finds crossover points.
    SignalService-->>BackendAPI: Returns { historical_data, signals: [...] }
    deactivate SignalService
    BackendAPI-->>Browser: 200 OK (JSON Response)
    deactivate BackendAPI
    Browser->>AssetDetailPage: Receives data
    activate AssetDetailPage
    Note right of AssetDetailPage: Updates component state and re-renders Chart.js component with new price and signal data.
    deactivate AssetDetailPage
```
```mermaid
flowchart TD
    A[Start: _find_crossovers] --> B{Receive price data, short SMA, long SMA}
    B --> C["Initialize empty 'signals' list"]
    C --> D(Loop through data points from day 1 to end)
    D --> E{"Is there a previous day's data?"}
    E -->|Yes| F["Get today's and yesterday's values"]
    E -->|No| G(Continue to next day)
    G --> D
    F --> H{Short SMA > Long SMA Today?}
    H -->|Yes| I{Short SMA < Long SMA Yesterday?}
    I -->|Yes| J["Add 'Buy' signal for today's date to list"]
    I -->|No| K(Continue to next day)
    H -->|No| L{Short SMA < Long SMA Today?}
    L -->|Yes| M{Short SMA > Long SMA Yesterday?}
    M -->|Yes| N["Add 'Sell' signal for today's date to list"]
    M -->|No| K
    J --> K
    N --> K
    K --> D
    D --> O[End of Loop]
    O --> P["Return 'signals' list"]
    P --> Q[End]
```
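The crossover detection in the flowchart above can be sketched in Python. This is a simplified illustration (`sma` and `find_crossovers` are illustrative names; the real service also carries dates and prices with each signal):

```python
from typing import List, Optional, Tuple

def sma(prices: List[float], window: int) -> List[Optional[float]]:
    """Simple moving average; entries before a full window are None."""
    out: List[Optional[float]] = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window : i + 1]) / window)
    return out

def find_crossovers(short_sma, long_sma) -> List[Tuple[int, str]]:
    """Return (index, 'buy'|'sell') wherever the short SMA crosses the long SMA."""
    signals = []
    for i in range(1, len(short_sma)):
        s_prev, l_prev = short_sma[i - 1], long_sma[i - 1]
        s_now, l_now = short_sma[i], long_sma[i]
        if None in (s_prev, l_prev, s_now, l_now):
            continue  # not enough history for both windows yet
        if s_now > l_now and s_prev < l_prev:
            signals.append((i, "buy"))    # short SMA crossed above long SMA
        elif s_now < l_now and s_prev > l_prev:
            signals.append((i, "sell"))   # short SMA crossed below long SMA
    return signals

# Tiny windows (2/3) keep the example short; the app uses SMA 20 / SMA 50.
prices = [3, 2, 1, 5, 9, 2, 1, 1]
print(find_crossovers(sma(prices, 2), sma(prices, 3)))  # [(3, 'buy'), (6, 'sell')]
```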
```mermaid
graph TD
    subgraph "Routing"
        App_tsx["App.tsx (Router)"]
    end
    subgraph "Pages"
        PortfolioPage["PortfolioPage.tsx"]
        AssetDetailPage["AssetDetailPage.tsx"]
    end
    subgraph "Components"
        AssetChart["AssetChart.tsx"]
        PeriodSelector["PeriodSelector.tsx"]
    end
    subgraph "Services"
        apiService["apiService.ts"]
    end
    App_tsx --> PortfolioPage
    App_tsx --> AssetDetailPage
    PortfolioPage -- "Link to" --> AssetDetailPage
    AssetDetailPage --> AssetChart
    AssetDetailPage --> PeriodSelector
    PortfolioPage -- "uses" --> apiService
    AssetDetailPage -- "uses" --> apiService
    style apiService fill:#bbf,stroke:#333,stroke-width:2px
```
```mermaid
sequenceDiagram
    actor User
    participant Page as "AssetDetailPage (Component)"
    participant Chart as "AssetChart (Component)"
    participant APISvc as "apiService.ts"
    participant BrowserAPI as "Browser Fetch/Axios"
    User->>Page: Selects "1 Year" from PeriodSelector
    activate Page
    Page->>Page: setState(loading: true)
    Page->>APISvc: getAssetSignals("AAPL", period="1y")
    activate APISvc
    APISvc->>BrowserAPI: GET /api/v1/signals/AAPL?period=1y
    activate BrowserAPI
    Note right of BrowserAPI: Makes network request to backend.
    BrowserAPI-->>APISvc: Returns Promise with JSON data
    deactivate BrowserAPI
    APISvc-->>Page: Returns Promise<SignalData>
    deactivate APISvc
    Page->>Page: setState({ data: SignalData, loading: false })
    Note right of Page: React triggers a re-render because state changed.
    Page->>Chart: Passes new data as props
    activate Chart
    Note right of Chart: Chart.js library re-renders the canvas with new price line and signal markers.
    Chart-->>Page: Renders updated chart
    deactivate Chart
    Page-->>User: Displays updated chart with signals
    deactivate Page
```
```mermaid
graph TD
    subgraph "API Layer"
        FastAPI_App["FastAPI App"]
        Signals_Endpoint["Signals Endpoint"]
    end
    subgraph "Service Layer"
        Signal_Service["Signal Service"]
        Data_Orchestrator["Data Orchestrator"]
    end
    subgraph "Data Provider Layer"
        yfinance_Provider["yfinance Provider"]
        AlphaVantage_Provider["AlphaVantage Provider"]
    end
    subgraph "Data Access Layer"
        Asset_CRUD["Asset CRUD"]
        Database["Database (PostgreSQL)"]
    end
    FastAPI_App -- "routes to" --> Signals_Endpoint
    Signals_Endpoint -- "uses" --> Signal_Service
    Signal_Service -- "uses" --> Data_Orchestrator
    Signal_Service -- "uses for asset_type lookup" --> Asset_CRUD
    Data_Orchestrator -- "uses (primary)" --> yfinance_Provider
    Data_Orchestrator -- "uses (fallback)" --> AlphaVantage_Provider
    Asset_CRUD -- "interacts with" --> Database
    style yfinance_Provider fill:#e6f3ff,stroke:#36c
    style AlphaVantage_Provider fill:#e6f3ff,stroke:#36c
    style Asset_CRUD fill:#f0e6ff,stroke:#639
    style Database fill:#f0e6ff,stroke:#639
```
```mermaid
sequenceDiagram
    participant API as "/api/v1/signals/{symbol}"
    participant SigSvc as "SignalService"
    participant AssetCRUD as "Asset CRUD Module"
    participant DB as "Database"
    participant DataOrc as "DataOrchestrator"
    participant YFProv as "yfinance Provider"
    API->>SigSvc: get_sma_crossover_signal("AAPL", short=20, long=50)
    Note right of SigSvc: Need asset_type to fetch correct historical data.
    SigSvc->>AssetCRUD: get_asset_by_symbol("AAPL")
    AssetCRUD->>DB: SELECT * FROM assets WHERE symbol = 'AAPL'
    DB-->>AssetCRUD: Returns Asset(asset_type='stock')
    AssetCRUD-->>SigSvc: Returns Asset Model
    SigSvc->>DataOrc: get_historical_data("AAPL", asset_type='stock', outputsize="full")
    Note right of DataOrc: Checks Redis cache first. On miss...
    DataOrc->>YFProv: fetch_yf_historical_data("AAPL", asset_type='stock', period="max")
    Note right of YFProv: Makes live call to Yahoo Finance API.
    YFProv-->>DataOrc: Returns List[HistoricalPricePoint]
    Note right of DataOrc: Sets the result to Redis cache.
    DataOrc-->>SigSvc: Returns List[HistoricalPricePoint]
    Note right of SigSvc: 1. Calculates 20-day SMA.<br/>2. Calculates 50-day SMA.<br/>3. Finds crossover points.
    SigSvc-->>API: Returns { historical_data, signals }
```
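The primary/fallback behaviour of the DataOrchestrator shown above can be sketched generically. This is an illustrative sketch, not the project's code: the stub providers stand in for the real wrappers around yfinance and Alpha Vantage:

```python
from typing import Callable, List, Optional

# A provider takes a symbol and returns historical price points (or nothing).
Provider = Callable[[str], Optional[List[dict]]]

def get_historical_data(symbol: str, providers: List[Provider]) -> Optional[List[dict]]:
    """Try each provider in order; fall through on errors or empty results."""
    for fetch in providers:
        try:
            data = fetch(symbol)
        except Exception:
            continue  # provider failed (rate limit, timeout, ...): try the next one
        if data:
            return data
    return None  # every provider failed

# Stub providers: the primary fails, the fallback answers.
def primary(symbol: str):
    raise TimeoutError("upstream unavailable")

def fallback(symbol: str):
    return [{"date": "2024-01-02", "close": 185.6}]

print(get_historical_data("AAPL", [primary, fallback]))  # [{'date': '2024-01-02', 'close': 185.6}]
```

Ordering the provider list is what makes Yahoo Finance primary and Alpha Vantage the fallback; swapping the order, or appending more providers, needs no change to the orchestration logic.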