Remove data.csv file and update README to reflect new features including CSV import and banking account management. Enhance TransactionsTable and KreditorTable components with banking account handling, including UI updates and validation logic. Update SQL schema to support banking accounts and adjust API routes for improved data handling. Implement new document rendering logic for banking transactions and enhance recipient rendering with banking account status. Add new views and indexes for better transaction management.

This commit is contained in:
sebseb7
2025-08-01 13:26:26 +02:00
parent 6cde543938
commit fbfd918d81
13 changed files with 1774 additions and 1409 deletions

README.md
View File

@@ -1,18 +1,20 @@
# FibDash
A modern React Material-UI dashboard for financial reconciliation with Google SSO authentication, CSV import/analysis, DATEV export, and optional MSSQL integration with JTL tables.
## Features
- 🚀 React 18 (class components)
- 🎨 Material-UI (MUI) UI with responsive layout
- 🔐 Google SSO (Google Identity Services) + JWT API auth
- 🗄️ MSSQL integration (optional; app runs without DB)
- 📥 CSV import of bank transactions (German MT940-like CSV)
- 🔍 Reconciliation view: CSV vs JTL (if DB available)
- 📤 DATEV export for selected month/quarter/year
- 🧩 Admin data management (Kreditor, Konto, BU)
- ⚡ Webpack dev server (HMR) + nodemon hot-reload
- 🛡️ Email allowlist authorization
- 🧰 Production single-process build (Express serves React)
## Architecture
@@ -27,337 +29,249 @@ fibdash/
│ └── public/
│ └── index.html # HTML template
├── src/ # Backend Express API
│ ├── config/ # Database configuration (MSSQL)
│ ├── middleware/ # Auth + email allowlist middleware
│ ├── routes/ # API routes (auth, data, admin, dashboard)
│ ├── database/ # SQL schema and CSV import schema
│ └── index.js # Express server entry point
├── nginx.*.conf # Nginx reverse-proxy configs (dev/prod/simple)
├── webpack*.config.js # Webpack configs
├── docker-compose.dev.yml # Optional dev docker-compose for proxying
├── data.csv # Sample CSV for local analysis
└── package.json # Dependencies and scripts
```
## Functional Overview
Authentication and Authorization
- Login: Frontend uses Google Identity Services. The backend validates the ID token and issues a JWT.
- Email allowlist: only emails listed in AUTHORIZED_EMAILS, or matching the optional DB rule, are allowed.
- ENV allowlist: fast path check.
- DB check: optional, queries JTL tables to verify access by attributes 219/220.
- JWT middleware guards all /api routes.
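The ENV-allowlist fast path can be sketched as follows (a minimal illustration with hypothetical helper names; the real middleware additionally supports the optional JTL DB check):

```javascript
// Minimal sketch of the AUTHORIZED_EMAILS fast-path check.
// Hypothetical helper names; the actual middleware also queries the DB.
function parseAllowlist(env) {
  // AUTHORIZED_EMAILS is comma-separated; unset/empty means nobody may log in.
  return (env.AUTHORIZED_EMAILS || '')
    .split(',')
    .map((e) => e.trim().toLowerCase())
    .filter(Boolean);
}

function isAuthorized(email, env) {
  return parseAllowlist(env).includes(String(email).toLowerCase());
}
```

Note that an empty or missing AUTHORIZED_EMAILS yields an empty allowlist, so every login is rejected, matching the "no users can access" behavior described below.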
CSV Analysis and Reconciliation
- Upload or place CSV at project root (data.csv) for quick testing.
- CSV parsing: German semicolon-separated format with headers like Buchungstag, Betrag, Verwendungszweck, IBAN, etc.
- Reconciliation:
- If MSSQL available: fetch JTL transactions (tZahlungsabgleichUmsatz), related PDFs (tUmsatzBeleg, tPdfObjekt), and links; match by date+amount.
- Kreditor lookup: optional mapping via fibdash.Kreditor using IBAN; also supports banking accounts via is_banking.
- Summary totals: income, expenses, net, match counts.
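The normalization and matching described above can be sketched like this (illustrative helpers, assuming the German CSV conventions named earlier; the server-side field names may differ):

```javascript
// German CSV amounts use a thousands dot and decimal comma: "1.234,56".
function parseGermanAmount(value) {
  return parseFloat(value.replace(/\./g, '').replace(',', '.'));
}

// German dates are DD.MM.YYYY; normalize to ISO for comparison.
function parseGermanDate(value) {
  const [day, month, year] = value.split('.');
  return `${year}-${month}-${day}`;
}

// A CSV row and a JTL transaction are marked as matched when the
// same-day date and the exact amount agree.
function matchKey(dateIso, amount) {
  return `${dateIso}|${amount.toFixed(2)}`;
}
```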
DATEV Export
- Endpoint returns CSV in DATEV format for the chosen period (month, quarter, year).
- Headers and column mapping are created server-side; amounts are normalized; the response is served as text/csv.
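One piece of that normalization can be sketched as follows (illustrative only; DATEV-style CSV uses semicolon separators and German decimal commas, and the exact column mapping lives server-side):

```javascript
// Illustrative amount formatting for a DATEV-style export column:
// absolute value with a decimal comma (debit/credit is a separate column).
function formatDatevAmount(amount) {
  return Math.abs(amount).toFixed(2).replace('.', ',');
}
```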
Admin Management
- Kreditor: CRUD for name, kreditorId, IBAN (optional if is_banking=true).
- Konto: CRUD account numbers/names used for accounting.
- BU (Buchungsschlüssel): CRUD including optional VSt.
- System info endpoint.
Health
- /api/health returns OK and timestamp.
## API Surface (key endpoints)
Auth
- POST /api/auth/google — Google token exchange → JWT
- GET /api/auth/verify — Validate JWT, returns user
- POST /api/auth/logout — Stateless logout success
Dashboard
- GET /api/dashboard — Mock dashboard stats
- GET /api/dashboard/user — Returns current JWT user
Data and Reconciliation
- GET /api/data/months — Available months inferred from data.csv
- GET /api/data/transactions/:timeRange — Combined CSV + JTL view with summary
- timeRange supports YYYY-MM, YYYY, YYYY-Q1..Q4
- GET /api/data/datev/:timeRange — Download DATEV CSV
- GET /api/data/pdf/umsatzbeleg/:kUmsatzBeleg — Stream PDF from tUmsatzBeleg
- GET /api/data/pdf/pdfobject/:kPdfObjekt — Stream PDF from tPdfObjekt
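Resolving the `:timeRange` parameter into a period could look like this (a hypothetical helper, not the exact route code):

```javascript
// Map "YYYY-MM", "YYYY", or "YYYY-Q1".."YYYY-Q4" to a year and an
// inclusive month range; returns null for unrecognized input.
function resolveTimeRange(timeRange) {
  let m;
  if ((m = /^(\d{4})-Q([1-4])$/.exec(timeRange))) {
    const startMonth = (Number(m[2]) - 1) * 3 + 1;
    return { year: Number(m[1]), months: [startMonth, startMonth + 2] };
  }
  if ((m = /^(\d{4})-(\d{2})$/.exec(timeRange))) {
    return { year: Number(m[1]), months: [Number(m[2]), Number(m[2])] };
  }
  if ((m = /^(\d{4})$/.exec(timeRange))) {
    return { year: Number(m[1]), months: [1, 12] };
  }
  return null;
}
```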
Kreditor, Konto, BU
- GET /api/data/kreditors
- GET /api/data/kreditors/:id
- POST /api/data/kreditors
- PUT /api/data/kreditors/:id
- DELETE /api/data/kreditors/:id
- GET /api/data/assignable-kreditors — only non-banking
- BankingAccountTransactions assignments: POST/PUT/DELETE, and GET /api/data/banking-transactions/:transactionId
- Admin counterparts exist under /api/admin for Kreditor/Konto/BU CRUD.
CSV Import (to DB)
- POST /api/data/import-csv-transactions — Validates rows and inserts into fibdash.CSVTransactions
- GET /api/data/csv-transactions — Paginated list with kreditor joins and assignment info
- GET /api/data/csv-import-batches — Import batch summaries
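The request body sent to the import endpoint has this shape (taken from the CSV import dialog's `handleImport`; the row object shown here uses sample values):

```javascript
// Body for POST /api/data/import-csv-transactions, as built by the
// frontend import dialog (sample row values; auth header omitted).
const body = {
  transactions: [
    { Buchungstag: '01.08.2025', Betrag: '-12,34', Verwendungszweck: 'Miete' },
  ],
  headers: ['Buchungstag', 'Betrag', 'Verwendungszweck'],
  filename: 'data.csv',
  batchId: `import_${Date.now()}_data.csv`,
};
// On success the endpoint responds with { imported, errors, batchId },
// which the dialog shows in its results summary.
```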
## Frontend UX Summary
App shell
- Top AppBar with title, tabs (Dashboard, Stammdaten), current user, DATEV export button when applicable, and logout.
Login
- Button triggers Google prompt; robust error messaging for SSO, service unavailability, authorization.
Dashboard view
- Month selector, summary header, and a transactions table with reconciliation indicators (CSV-only, JTL-only, matched), PDF links when available.
Stammdaten view
- Management UI for Kreditors, Konten, and Buchungsschlüssel (class-based components under client/src/components/admin).
CSV Import
- Modal dialog with drag-and-drop or file picker, header detection, basic validation, progress, and results summary; uses /api/data/import-csv-transactions.
## Prerequisites
- Node.js (v16+)
- Optionally: MSSQL Server for JTL and fibdash schema
- Google Cloud project and OAuth 2.0 Client ID
## Setup
1) Clone and install
```bash
git clone <your-repo-url>
cd fibdash
npm install
```
2) Environment
```bash
cp .env.example .env
# then edit .env
```
Required variables
- GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET
- REACT_APP_GOOGLE_CLIENT_ID (must match GOOGLE_CLIENT_ID)
- JWT_SECRET
- Optional authorization: AUTHORIZED_EMAILS=admin@company.com,user@company.com
- MSSQL: DB_SERVER, DB_DATABASE, DB_USERNAME, DB_PASSWORD, DB_PORT=1433
- Server: PORT=5000, NODE_ENV=development
Example .env:
```env
# Google OAuth configuration
GOOGLE_CLIENT_ID=your_google_client_id_here
GOOGLE_CLIENT_SECRET=your_google_client_secret_here
# Frontend environment variables (REACT_APP_ prefix required)
REACT_APP_GOOGLE_CLIENT_ID=your_google_client_id_here
# JWT secret (generate a secure random string)
JWT_SECRET=your_jwt_secret_here
# MSSQL database configuration
DB_SERVER=your_mssql_server_here
DB_DATABASE=your_database_name_here
DB_USERNAME=your_db_username_here
DB_PASSWORD=your_db_password_here
DB_PORT=1433
# Server configuration
PORT=5000
NODE_ENV=development
```
3) Google OAuth
- Create an OAuth 2.0 Web Client in the Google Cloud Console.
- Authorized JavaScript origins: http://localhost:5001 (dev) and your domain(s).
- Authorized redirect URIs: matching roots (e.g., http://localhost:5001/).
- See the detailed guide: [docs/GOOGLE_OAUTH_SETUP.md](docs/GOOGLE_OAUTH_SETUP.md)
4) Database (optional, but required for JTL features)
- Create the database and run the schemas:
  - Core schema: src/database/schema.sql
  - CSV import schema: src/database/csv_transactions_schema.sql (if needed)
```bash
# Connect to your MSSQL server and run:
sqlcmd -S your_server -d your_database -i src/database/schema.sql
```
- Alternatively, execute the SQL commands in src/database/schema.sql manually.
- The app runs without a DB; JTL features and admin CRUD will error if the DB is not configured.
5) Development
```bash
npm run dev            # frontend at 5001 (HMR) and backend at 5000 (nodemon)
# or separately:
npm run dev:frontend
npm run dev:backend
```
6) Optional: Nginx for development
- Automatic (Linux/macOS):
```bash
npm run setup:nginx
```
- Manual:
```bash
sudo cp nginx.simple.conf /etc/nginx/sites-available/fibdash-dev
sudo ln -s /etc/nginx/sites-available/fibdash-dev /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```
- Hosts entry (optional):
```bash
echo "127.0.0.1 fibdash.local" | sudo tee -a /etc/hosts
```
With nginx:
- App: http://localhost/ or http://fibdash.local/
- API: http://localhost/api/
- Direct frontend: http://localhost:5001/
- Direct backend: http://localhost:5000/
## Production
Single-process model
- Express serves the built React app and handles APIs and auth.
Build and run
```bash
npm start        # build frontend and start backend
# or:
npm run build
npm run start:prod
```
Nginx production reverse proxy
```bash
sudo cp nginx.prod.conf /etc/nginx/sites-available/fibdash-prod
sudo ln -s /etc/nginx/sites-available/fibdash-prod /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```
Production features:
- Static asset caching, gzip compression, security headers, optional TLS (commented out in the config, ready to enable).
## Environment variables
- GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET
- REACT_APP_GOOGLE_CLIENT_ID (must match GOOGLE_CLIENT_ID)
- JWT_SECRET
- AUTHORIZED_EMAILS (comma-separated; if unset or empty, no users can access; the first email in the list gets admin privileges)
- DB_SERVER, DB_DATABASE, DB_USERNAME, DB_PASSWORD, DB_PORT (default 1433)
- PORT (default 5000), NODE_ENV
## Data model (MSSQL)
Core tables referenced
- eazybusiness.dbo.tZahlungsabgleichUmsatz (+ tUmsatzKontierung)
- tUmsatzBeleg (PDF storage), tPdfObjekt (PDF objects), tZahlungsabgleichUmsatzLink (links)
- fibdash.Kreditor (id, iban, name, kreditorId, is_banking)
- fibdash.Konto (id, konto, name)
- fibdash.BU (id, bu, name, vst)
- fibdash.BankingAccountTransactions (assignment records)
- fibdash.CSVTransactions (imported CSV rows)
See src/database/schema.sql and src/database/csv_transactions_schema.sql for the exact DDL.
## Developer notes
- The backend auto-detects DB availability. If DB env vars are missing, it logs a warning and continues; endpoints that require the DB respond with 500/404 as applicable.
- data.csv at the repo root is used by /api/data/months and /api/data/transactions/* for local CSV-based analysis.
- Matching between CSV and JTL transactions uses exact amount and same-day date.
- DATEV export uses a well-formed header, consistent number formatting, and limited text field lengths.
- Admins (the first email in the allowlist) can manage authorized emails via GET/POST/DELETE /api/admin/authorized-emails and inspect GET /api/admin/system-info. Changes made via the API are temporary; update the .env file for permanent changes.
## Scripts
- npm run dev — run frontend + backend
- npm run dev:frontend — frontend only (HMR)
- npm run dev:backend — backend only (nodemon)
- npm run build — frontend production build
- npm run build:prod — build and start the production server
- npm start — build frontend and start backend
- npm run start:prod — start backend with an existing build
- npm run setup:nginx — dev nginx setup
- npm run nginx:test|reload|start|stop|status — nginx helpers
## Troubleshooting
Database
- Ensure the MSSQL server is reachable, the DB_* env vars are set, and the firewall allows port 1433; check the server logs for details.
- Without a DB, JTL and admin features won't function; CSV-only features still work.
Google OAuth
- Ensure GOOGLE_CLIENT_ID and REACT_APP_GOOGLE_CLIENT_ID are both set and identical.
- Add your dev/prod origins and redirect URIs in the Google Cloud Console and make sure the OAuth consent screen is configured.
- Use HTTPS in production and ensure the CSP allows Google domains.
- Use the "Alternative Google Sign-In" button as a fallback for GSI errors.
- For detailed troubleshooting, see [docs/GOOGLE_OAUTH_SETUP.md](docs/GOOGLE_OAUTH_SETUP.md).
CORS/Headers
- Dev CORS is open. Tighten it in production behind nginx.
Hot reload
- Make sure both dev servers are running; clear the browser cache; check the webpack proxy configuration.
## License
ISC

View File

@@ -0,0 +1,252 @@
import React, { Component } from 'react';
import {
Box,
FormControl,
InputLabel,
Select,
MenuItem,
TextField,
Button,
Alert,
CircularProgress,
Typography,
} from '@mui/material';
import AuthService from '../services/AuthService';
class BankingKreditorSelector extends Component {
constructor(props) {
super(props);
this.state = {
assignableKreditors: [],
selectedKreditorId: '',
notes: '',
loading: false,
error: null,
saving: false,
};
this.authService = new AuthService();
}
componentDidMount() {
this.loadAssignableKreditors();
this.loadExistingAssignment();
}
componentDidUpdate(prevProps) {
// Reload data when transaction changes
if (this.props.transaction?.id !== prevProps.transaction?.id) {
this.loadExistingAssignment();
}
}
loadAssignableKreditors = async () => {
try {
this.setState({ loading: true, error: null });
const response = await this.authService.apiCall('/data/assignable-kreditors');
if (response && response.ok) {
const kreditors = await response.json();
this.setState({ assignableKreditors: kreditors, loading: false });
} else {
this.setState({
error: 'Fehler beim Laden der verfügbaren Kreditoren',
loading: false
});
}
} catch (error) {
console.error('Error loading assignable kreditors:', error);
this.setState({
error: 'Fehler beim Laden der verfügbaren Kreditoren',
loading: false
});
}
};
loadExistingAssignment = async () => {
// CSV transactions may expose csv_id instead of id
const transactionId = this.props.transaction?.id || this.props.transaction?.csv_id;
if (!transactionId) return;
try {
const response = await this.authService.apiCall(
`/data/banking-transactions/${transactionId}`
);
if (response && response.ok) {
const assignments = await response.json();
if (assignments.length > 0) {
const assignment = assignments[0];
this.setState({
selectedKreditorId: assignment.assigned_kreditor_id || '',
notes: assignment.notes || '',
});
}
}
} catch (error) {
console.error('Error loading existing assignment:', error);
// Don't show error for missing assignments - it's normal
}
};
handleKreditorChange = (event) => {
this.setState({ selectedKreditorId: event.target.value });
};
handleNotesChange = (event) => {
this.setState({ notes: event.target.value });
};
handleSave = async () => {
const { transaction, user, onSave } = this.props;
const { selectedKreditorId, notes } = this.state;
if (!selectedKreditorId) {
this.setState({ error: 'Bitte wählen Sie einen Kreditor aus' });
return;
}
this.setState({ saving: true, error: null });
try {
// Check if assignment already exists
const checkResponse = await this.authService.apiCall(
`/data/banking-transactions/${transaction.id}`
);
let response;
if (checkResponse && checkResponse.ok) {
const existingAssignments = await checkResponse.json();
if (existingAssignments.length > 0) {
// Update existing assignment
response = await this.authService.apiCall(
`/data/banking-transactions/${existingAssignments[0].id}`,
'PUT',
{
assigned_kreditor_id: parseInt(selectedKreditorId),
notes: notes.trim() || null,
assigned_by: user?.username || 'Unknown',
}
);
} else {
// Create new assignment
response = await this.authService.apiCall(
'/data/banking-transactions',
'POST',
{
transaction_id: transaction.id || null,
csv_transaction_id: transaction.csv_id || transaction.id || null,
banking_iban: transaction['Kontonummer/IBAN'] || transaction.kontonummer_iban,
assigned_kreditor_id: parseInt(selectedKreditorId),
notes: notes.trim() || null,
assigned_by: user?.username || 'Unknown',
}
);
}
}
if (response && response.ok) {
this.setState({ saving: false });
if (onSave) {
onSave();
}
} else {
// response may be undefined if the pre-check request failed
const errorData = response ? await response.json() : {};
this.setState({
error: errorData.error || 'Fehler beim Speichern der Zuordnung',
saving: false
});
}
} catch (error) {
console.error('Error saving kreditor assignment:', error);
this.setState({
error: 'Fehler beim Speichern der Zuordnung',
saving: false
});
}
};
render() {
const {
assignableKreditors,
selectedKreditorId,
notes,
loading,
error,
saving
} = this.state;
if (loading) {
return (
<Box display="flex" justifyContent="center" py={2}>
<CircularProgress size={20} />
<Typography variant="caption" sx={{ ml: 1 }}>
Lade Kreditoren...
</Typography>
</Box>
);
}
return (
<Box>
{error && (
<Alert severity="error" sx={{ mb: 2 }}>
{error}
</Alert>
)}
<FormControl fullWidth sx={{ mb: 2 }} size="small">
<InputLabel id="kreditor-select-label">
Kreditor auswählen *
</InputLabel>
<Select
labelId="kreditor-select-label"
value={selectedKreditorId}
onChange={this.handleKreditorChange}
label="Kreditor auswählen *"
>
{assignableKreditors.map((kreditor) => (
<MenuItem key={kreditor.id} value={kreditor.id}>
{kreditor.name} ({kreditor.kreditorId})
</MenuItem>
))}
</Select>
</FormControl>
<TextField
fullWidth
label="Notizen (optional)"
multiline
rows={2}
value={notes}
onChange={this.handleNotesChange}
placeholder="Zusätzliche Informationen..."
sx={{ mb: 2 }}
size="small"
/>
<Button
onClick={this.handleSave}
variant="contained"
disabled={!selectedKreditorId || saving}
size="small"
sx={{
bgcolor: '#ff5722',
'&:hover': { bgcolor: '#e64a19' }
}}
>
{saving ? (
<>
<CircularProgress size={16} sx={{ mr: 1 }} />
Speichern...
</>
) : (
'Kreditor zuordnen'
)}
</Button>
</Box>
);
}
}
export default BankingKreditorSelector;

View File

@@ -0,0 +1,350 @@
import React, { Component } from 'react';
import {
Dialog,
DialogTitle,
DialogContent,
DialogActions,
Button,
Typography,
Box,
Alert,
CircularProgress,
LinearProgress,
Chip,
} from '@mui/material';
import {
CloudUpload as UploadIcon,
CheckCircle as SuccessIcon,
Error as ErrorIcon,
} from '@mui/icons-material';
import AuthService from '../services/AuthService';
class CSVImportDialog extends Component {
constructor(props) {
super(props);
this.state = {
file: null,
csvData: null,
headers: null,
importing: false,
imported: false,
importResult: null,
error: null,
dragOver: false,
};
this.authService = new AuthService();
this.fileInputRef = React.createRef();
}
handleFileSelect = (event) => {
const file = event.target.files[0];
if (file) {
this.processFile(file);
}
};
handleDrop = (event) => {
event.preventDefault();
this.setState({ dragOver: false });
const file = event.dataTransfer.files[0];
if (file) {
this.processFile(file);
}
};
handleDragOver = (event) => {
event.preventDefault();
this.setState({ dragOver: true });
};
handleDragLeave = () => {
this.setState({ dragOver: false });
};
processFile = (file) => {
if (!file.name.toLowerCase().endsWith('.csv')) {
this.setState({ error: 'Bitte wählen Sie eine CSV-Datei aus' });
return;
}
this.setState({ file, error: null, csvData: null, headers: null });
const reader = new FileReader();
reader.onload = (e) => {
try {
const text = e.target.result;
const lines = text.split('\n').filter(line => line.trim());
if (lines.length < 2) {
this.setState({ error: 'CSV-Datei muss mindestens eine Kopfzeile und eine Datenzeile enthalten' });
return;
}
// Parse CSV (simple parsing - assumes semicolon separator and quoted fields)
const parseCSVLine = (line) => {
const result = [];
let current = '';
let inQuotes = false;
for (let i = 0; i < line.length; i++) {
const char = line[i];
if (char === '"') {
inQuotes = !inQuotes;
} else if (char === ';' && !inQuotes) {
result.push(current.trim());
current = '';
} else {
current += char;
}
}
result.push(current.trim());
return result;
};
const headers = parseCSVLine(lines[0]);
const dataRows = lines.slice(1).map(line => {
const values = parseCSVLine(line);
const row = {};
headers.forEach((header, index) => {
row[header] = values[index] || '';
});
return row;
});
this.setState({
csvData: dataRows,
headers,
error: null
});
} catch (error) {
console.error('Error parsing CSV:', error);
this.setState({ error: 'Fehler beim Lesen der CSV-Datei' });
}
};
reader.readAsText(file, 'UTF-8');
};
handleImport = async () => {
const { csvData, headers, file } = this.state;
if (!csvData || csvData.length === 0) {
this.setState({ error: 'Keine Daten zum Importieren gefunden' });
return;
}
this.setState({ importing: true, error: null });
try {
const response = await this.authService.apiCall('/data/import-csv-transactions', 'POST', {
transactions: csvData,
headers: headers,
filename: file.name,
batchId: `import_${Date.now()}_${file.name}`
});
if (response && response.ok) {
const result = await response.json();
this.setState({
importing: false,
imported: true,
importResult: result
});
if (this.props.onImportSuccess) {
this.props.onImportSuccess(result);
}
} else {
// response may be undefined if the request itself failed
const errorData = response ? await response.json() : {};
this.setState({
importing: false,
error: errorData.error || 'Import fehlgeschlagen'
});
}
} catch (error) {
console.error('Import error:', error);
this.setState({
importing: false,
error: 'Netzwerkfehler beim Import'
});
}
};
handleClose = () => {
this.setState({
file: null,
csvData: null,
headers: null,
importing: false,
imported: false,
importResult: null,
error: null,
});
if (this.props.onClose) {
this.props.onClose();
}
};
render() {
const { open } = this.props;
const {
file,
csvData,
headers,
importing,
imported,
importResult,
error,
dragOver
} = this.state;
return (
<Dialog
open={open}
onClose={!importing ? this.handleClose : undefined}
maxWidth="md"
fullWidth
>
<DialogTitle>
CSV Transaktionen Importieren
</DialogTitle>
<DialogContent>
{!imported ? (
<>
{/* File Upload Area */}
<Box
sx={{
border: '2px dashed',
borderColor: dragOver ? 'primary.main' : 'grey.300',
borderRadius: 2,
p: 4,
textAlign: 'center',
bgcolor: dragOver ? 'action.hover' : 'background.paper',
cursor: 'pointer',
mb: 2,
}}
onDrop={this.handleDrop}
onDragOver={this.handleDragOver}
onDragLeave={this.handleDragLeave}
onClick={() => this.fileInputRef.current?.click()}
>
<input
type="file"
accept=".csv"
onChange={this.handleFileSelect}
ref={this.fileInputRef}
style={{ display: 'none' }}
/>
<UploadIcon sx={{ fontSize: 48, color: 'grey.400', mb: 2 }} />
<Typography variant="h6" gutterBottom>
CSV-Datei hier ablegen oder klicken zum Auswählen
</Typography>
<Typography variant="body2" color="textSecondary">
Unterstützte Formate: .csv (Semikolon-getrennt)
</Typography>
</Box>
{file && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Ausgewählte Datei:
</Typography>
<Chip label={file.name} color="primary" />
</Box>
)}
{headers && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Erkannte Spalten ({headers.length}):
</Typography>
<Box sx={{ display: 'flex', flexWrap: 'wrap', gap: 0.5 }}>
{headers.slice(0, 10).map((header, index) => (
<Chip key={index} label={header} size="small" variant="outlined" />
))}
{headers.length > 10 && (
<Chip label={`+${headers.length - 10} weitere`} size="small" />
)}
</Box>
</Box>
)}
{csvData && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Gefundene Transaktionen: {csvData.length}
</Typography>
<Typography variant="body2" color="textSecondary">
Die Daten werden validiert und in die Datenbank importiert.
</Typography>
</Box>
)}
{error && (
<Alert severity="error" sx={{ mb: 2 }}>
{error}
</Alert>
)}
{importing && (
<Box sx={{ mb: 2 }}>
<LinearProgress />
<Typography variant="body2" sx={{ mt: 1, textAlign: 'center' }}>
Importiere Transaktionen...
</Typography>
</Box>
)}
</>
) : (
/* Import Success */
<Box sx={{ textAlign: 'center', py: 2 }}>
<SuccessIcon sx={{ fontSize: 64, color: 'success.main', mb: 2 }} />
<Typography variant="h6" gutterBottom>
Import erfolgreich abgeschlossen!
</Typography>
{importResult && (
<Box sx={{ mt: 2 }}>
<Typography variant="body1" gutterBottom>
<strong>Importiert:</strong> {importResult.imported} Transaktionen
</Typography>
{importResult.errors > 0 && (
<Typography variant="body1" color="warning.main">
<strong>Fehler:</strong> {importResult.errors} Zeilen übersprungen
</Typography>
)}
<Typography variant="body2" color="textSecondary" sx={{ mt: 1 }}>
Batch-ID: {importResult.batchId}
</Typography>
</Box>
)}
</Box>
)}
</DialogContent>
<DialogActions>
<Button onClick={this.handleClose} disabled={importing}>
{imported ? 'Schließen' : 'Abbrechen'}
</Button>
{!imported && csvData && (
<Button
onClick={this.handleImport}
variant="contained"
disabled={importing || !csvData}
startIcon={importing ? <CircularProgress size={16} /> : <UploadIcon />}
>
{importing ? 'Importiere...' : 'Importieren'}
</Button>
)}
</DialogActions>
</Dialog>
);
}
}
export default CSVImportDialog;
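The dialog above advertises semicolon-separated CSV files whose German column names match the `fibdash.CSVTransactions` schema added later in this commit. A minimal sketch of how one such row could be parsed, including the German decimal format ("1.234,56"); `parseGermanAmount` and `parseCsvLine` are illustrative names, not the actual import code:

```javascript
// Hypothetical sketch: parse one semicolon-separated CSV row.
// Header names follow the columns stored in fibdash.CSVTransactions;
// function names are illustrative, not taken from the real importer.
function parseGermanAmount(value) {
  // "1.234,56" -> 1234.56 (German thousands dot, decimal comma)
  if (!value) return null;
  const normalized = value.replace(/\./g, '').replace(',', '.');
  const amount = Number(normalized);
  return Number.isNaN(amount) ? null : amount;
}

function parseCsvLine(headers, line) {
  const values = line.split(';').map((v) => v.trim());
  const row = {};
  headers.forEach((header, i) => { row[header] = values[i] ?? ''; });
  return {
    buchungstag: row['Buchungstag'],
    betrag_original: row['Betrag'],          // keep the raw string, as the schema does
    betrag: parseGermanAmount(row['Betrag']),
    waehrung: row['Waehrung'],
    kontonummer_iban: row['Kontonummer/IBAN'],
    verwendungszweck: row['Verwendungszweck'],
  };
}

const headers = ['Buchungstag', 'Betrag', 'Waehrung', 'Kontonummer/IBAN', 'Verwendungszweck'];
const tx = parseCsvLine(headers, '01.07.2025;-1.234,56;EUR;DE89370400440532013000;PayPal Zahlung');
console.log(tx.betrag); // -1234.56
```

Storing both `betrag_original` and the parsed number mirrors the schema's `betrag_original`/`numeric_amount` split, so parsing errors stay auditable.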

View File

@@ -16,6 +16,7 @@ import { Clear as ClearIcon } from '@mui/icons-material';
import { getColumnDefs, defaultColDef, gridOptions } from './config/gridConfig';
import { processTransactionData, getRowStyle, getRowClass, getSelectedDisplayName } from './utils/dataUtils';
class TransactionsTable extends Component {
constructor(props) {
super(props);
@@ -282,7 +283,7 @@ class TransactionsTable extends Component {
console.log('Selected rows:', Array.from(selectedRows));
};
render() {
const { selectedMonth, loading } = this.props;
@@ -334,6 +335,7 @@ class TransactionsTable extends Component {
selectedRows: this.state.selectedRows,
onSelectionChange: this.onSelectionChange,
onSelectAll: this.onSelectAll,
totalRows: this.state.totalRows,
displayedRows: this.state.displayedRows
}}
@@ -479,6 +481,8 @@ class TransactionsTable extends Component {
</Tooltip>
)}
</Box>
</Paper>
);
}

View File

@@ -18,6 +18,8 @@ import {
Typography,
Alert,
CircularProgress,
Checkbox,
FormControlLabel,
} from '@mui/material';
import {
Add as AddIcon,
@@ -41,6 +43,7 @@ class KreditorTable extends Component {
iban: '',
name: '',
kreditorId: '',
is_banking: false,
},
};
this.authService = new AuthService();
@@ -78,13 +81,15 @@ class KreditorTable extends Component {
dialogOpen: true,
editingKreditor: kreditor,
formData: kreditor ? {
iban: kreditor.iban,
iban: kreditor.iban || '',
name: kreditor.name,
kreditorId: kreditor.kreditorId,
is_banking: Boolean(kreditor.is_banking),
} : {
iban: '',
name: '',
kreditorId: '',
is_banking: false,
},
});
};
@@ -97,6 +102,7 @@ class KreditorTable extends Component {
iban: '',
name: '',
kreditorId: '',
is_banking: false,
},
});
@@ -117,11 +123,24 @@ class KreditorTable extends Component {
});
};
handleCheckboxChange = (field) => (event) => {
this.setState({
formData: {
...this.state.formData,
[field]: event.target.checked,
},
});
};
isFormValid = () => {
const { formData } = this.state;
return formData.iban.trim() !== '' &&
formData.name.trim() !== '' &&
formData.kreditorId.trim() !== '';
// Name and kreditorId are always required
const basicFieldsValid = formData.name.trim() !== '' && formData.kreditorId.trim() !== '';
// IBAN is optional for banking accounts, required for regular kreditors
const ibanValid = formData.is_banking || formData.iban.trim() !== '';
return basicFieldsValid && ibanValid;
};
handleSave = async () => {
@@ -244,6 +263,7 @@ class KreditorTable extends Component {
<TableCell>Kreditor ID</TableCell>
<TableCell>Name</TableCell>
<TableCell>IBAN</TableCell>
<TableCell>Typ</TableCell>
<TableCell align="right">Aktionen</TableCell>
</TableRow>
</TableHead>
@@ -252,7 +272,18 @@ class KreditorTable extends Component {
<TableRow key={kreditor.id}>
<TableCell>{kreditor.kreditorId}</TableCell>
<TableCell>{kreditor.name}</TableCell>
<TableCell>{kreditor.iban}</TableCell>
<TableCell style={{
color: kreditor.is_banking ? '#ff5722' : 'inherit',
fontWeight: kreditor.is_banking ? 'bold' : 'normal'
}}>
{kreditor.iban || 'Keine IBAN'}
</TableCell>
<TableCell>
{kreditor.is_banking ?
<span style={{ color: '#ff5722', fontWeight: 'bold' }}>Banking</span> :
'Kreditor'
}
</TableCell>
<TableCell align="right">
<IconButton
size="small"
@@ -316,6 +347,18 @@ class KreditorTable extends Component {
variant="outlined"
value={formData.iban}
onChange={this.handleInputChange('iban')}
helperText={formData.is_banking ? "IBAN ist optional für Banking-Konten" : ""}
sx={{ mb: 2 }}
/>
<FormControlLabel
control={
<Checkbox
checked={formData.is_banking}
onChange={this.handleCheckboxChange('is_banking')}
color="primary"
/>
}
label="Banking-Konto (z.B. PayPal) - benötigt manuelle Kreditor-Zuordnung"
/>
</DialogContent>
<DialogActions>

View File

@@ -28,6 +28,7 @@ import {
} from '@mui/icons-material';
import { AgGridReact } from 'ag-grid-react';
import KreditorSelector from '../KreditorSelector';
import BankingKreditorSelector from '../BankingKreditorSelector';
const DocumentRenderer = (params) => {
// Check for PDFs and links regardless of transaction source
@@ -504,12 +505,22 @@ const DocumentRenderer = (params) => {
</Box>
) : params.data.hasKreditor ? (
<Box>
<Chip
label="Kreditor gefunden"
color="success"
size="small"
sx={{ mb: 2 }}
/>
{!params.data.kreditor?.is_banking && (
<Chip
label="Kreditor gefunden"
color="success"
size="small"
sx={{ mb: 2 }}
/>
)}
{params.data.kreditor?.is_banking && (
<Chip
label="Banking-Konto erkannt"
color="warning"
size="small"
sx={{ mb: 2 }}
/>
)}
<Box sx={{ mt: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Kreditor Details
@@ -520,7 +531,56 @@ const DocumentRenderer = (params) => {
<Typography variant="body2">
<strong>Kreditor ID:</strong> {params.data.kreditor.kreditorId}
</Typography>
<Typography variant="body2">
<strong>Typ:</strong> {params.data.kreditor.is_banking ? 'Banking-Konto' : 'Kreditor'}
</Typography>
</Box>
{/* Banking Account Assignment Section */}
{params.data.kreditor.is_banking && (
<Box sx={{ mt: 3, p: 2, bgcolor: '#fff3e0', borderRadius: 1, border: '1px solid #ff9800' }}>
<Typography variant="subtitle2" gutterBottom sx={{ color: '#ff5722', fontWeight: 'bold' }}>
🏦 Banking-Konto-Zuordnung
</Typography>
<Typography variant="body2" sx={{ mb: 2, color: '#666' }}>
Diese IBAN ist ein Banking-Konto (z.B. PayPal). Transaktionen müssen einem echten Kreditor zugeordnet werden.
</Typography>
{/* Show current assignment or assignment form */}
{params.data.assignedKreditor ? (
<Box sx={{ p: 2, bgcolor: '#e8f5e8', borderRadius: 1, mb: 2 }}>
<Typography variant="body2" sx={{ color: '#2e7d32', fontWeight: 'bold' }}>
Zugeordnet zu: {params.data.assignedKreditor.name}
</Typography>
<Typography variant="caption" sx={{ color: '#666' }}>
Kreditor ID: {params.data.assignedKreditor.kreditorId}
</Typography>
</Box>
) : (
<Box>
<Typography variant="body2" sx={{ mb: 2, color: '#ff5722', fontWeight: 'bold' }}>
Keine Zuordnung - Bitte Kreditor zuweisen
</Typography>
<Typography variant="caption" sx={{ color: '#666', mb: 2, display: 'block' }}>
Wählen Sie den echten Kreditor für diese Banking-Transaktion aus:
</Typography>
<BankingKreditorSelector
transaction={params.data}
user={params.context?.user}
onSave={() => {
// Refresh the grid to show updated assignment
if (params.api) {
params.api.refreshCells({
columns: ['Kontonummer/IBAN'],
force: true
});
}
}}
/>
</Box>
)}
</Box>
)}
</Box>
) : (
<Box>

View File

@@ -9,7 +9,7 @@ const RecipientRenderer = (params) => {
// Stop event propagation to prevent row selection
event.stopPropagation();
// Apply filter to IBAN column using the custom IbanSelectionFilter format
// Default behavior: Apply filter to IBAN column
const currentFilterModel = params.api.getFilterModel();
params.api.setFilterModel({
...currentFilterModel,
@@ -25,9 +25,20 @@ const RecipientRenderer = (params) => {
const getIbanColor = () => {
if (!isIbanColumn || !value) return 'inherit';
// Check if this transaction has Kreditor information
if (params.data && params.data.hasKreditor) {
return '#2e7d32'; // Green for found Kreditor
// Check if the kreditor is a banking account
if (params.data.kreditor?.is_banking) {
// Check if banking transaction has assigned kreditor
if (params.data.assignedKreditor) {
return '#00e676'; // Bright neon green for banking account with assigned kreditor
} else {
return '#ff5722'; // Red-orange for banking account needing assignment
}
} else {
return '#2e7d32'; // Dark green for regular kreditor
}
} else if (params.data && value) {
return '#ed6c02'; // Orange for IBAN without Kreditor
}
@@ -39,7 +50,15 @@ const RecipientRenderer = (params) => {
if (!isIbanColumn || !value) return undefined;
if (params.data && params.data.hasKreditor) {
return `IBAN "${value}" - Kreditor: ${params.data.kreditor?.name || 'Unbekannt'} (zum Filtern klicken)`;
if (params.data.kreditor?.is_banking) {
if (params.data.assignedKreditor) {
return `Banking-IBAN "${value}" - Zugeordnet zu: ${params.data.assignedKreditor.name} (zum Filtern klicken)`;
} else {
return `Banking-IBAN "${value}" - BENÖTIGT KREDITOR-ZUORDNUNG (zum Filtern klicken)`;
}
} else {
return `IBAN "${value}" - Kreditor: ${params.data.kreditor?.name || 'Unbekannt'} (zum Filtern klicken)`;
}
} else if (params.data && value) {
return `IBAN "${value}" - Kein Kreditor gefunden (zum Filtern klicken)`;
}
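Because the diff interleaves the old and new branches, the resulting color logic is easier to see as a single decision table. A standalone sketch of the same rules (`ibanColor` is an illustrative helper; the component reads these flags from `params.data`):

```javascript
// Consolidated IBAN color decision from RecipientRenderer.
// Illustrative standalone version; not part of this commit's code.
function ibanColor(data) {
  if (!data) return 'inherit';
  if (data.hasKreditor) {
    if (data.kreditor && data.kreditor.is_banking) {
      // Banking account: bright green once assigned, red-orange while pending
      return data.assignedKreditor ? '#00e676' : '#ff5722';
    }
    return '#2e7d32'; // regular kreditor match
  }
  return '#ed6c02';   // IBAN present but no kreditor found
}
```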

View File

@@ -149,8 +149,12 @@ class KreditorService {
validateKreditorData(kreditorData) {
const errors = [];
if (!kreditorData.iban || kreditorData.iban.trim() === '') {
errors.push('IBAN ist erforderlich');
// IBAN is required only for regular (non-banking) creditors;
// banking accounts and manual assignment targets may omit it
const isBanking = kreditorData.is_banking || false;
const hasIban = kreditorData.iban && kreditorData.iban.trim() !== '';
if (!isBanking && !hasIban) {
errors.push('IBAN ist erforderlich (außer für Banking-Konten oder manuelle Zuordnungen)');
}
if (!kreditorData.name || kreditorData.name.trim() === '') {
@@ -161,14 +165,20 @@ class KreditorService {
errors.push('Kreditor-ID ist erforderlich');
}
// Basic IBAN format validation (simplified)
if (kreditorData.iban && !/^[A-Z]{2}[0-9]{2}[A-Z0-9]{4}[0-9]{7}([A-Z0-9]?){0,16}$/i.test(kreditorData.iban.replace(/\s/g, ''))) {
// Basic IBAN format validation (simplified) - only if IBAN is provided
if (hasIban && !/^[A-Z]{2}[0-9]{2}[A-Z0-9]{4}[0-9]{7}([A-Z0-9]?){0,16}$/i.test(kreditorData.iban.replace(/\s/g, ''))) {
errors.push('IBAN Format ist ungültig');
}
// Validate kreditorId format (should start with 70xxx)
if (kreditorData.kreditorId && !/^70\d{3,}$/.test(kreditorData.kreditorId)) {
errors.push('Kreditor-ID muss mit 70 beginnen gefolgt von mindestens 3 Ziffern');
// Validate kreditorId format (should start with 70xxx for regular kreditors)
if (kreditorData.kreditorId && !isBanking && !/^70\d{3,}$/.test(kreditorData.kreditorId)) {
errors.push('Kreditor-ID muss mit 70 beginnen, gefolgt von mindestens 3 Ziffern (außer für Banking-Konten)');
}
// For banking accounts, warn about special handling
if (isBanking && hasIban) {
// This is just informational, not an error
console.info('Banking-Konto erkannt: Transaktionen benötigen manuelle Kreditor-Zuordnung');
}
return errors;

1050
data.csv

File diff suppressed because it is too large

View File

@@ -0,0 +1,124 @@
-- CSV Transactions Import Schema
-- This script creates a table to store imported CSV transaction data
-- Create CSVTransactions table to store imported CSV data
CREATE TABLE fibdash.CSVTransactions (
id INT IDENTITY(1,1) PRIMARY KEY,
-- Original CSV columns (German names as they appear in CSV)
buchungstag NVARCHAR(50), -- "Buchungstag"
wertstellung NVARCHAR(50), -- "Wertstellung"
umsatzart NVARCHAR(100), -- "Umsatzart"
betrag DECIMAL(15,2), -- "Betrag" (numeric value)
betrag_original NVARCHAR(50), -- Original string from CSV
waehrung NVARCHAR(10), -- "Waehrung"
beguenstigter_zahlungspflichtiger NVARCHAR(500), -- "Beguenstigter/Zahlungspflichtiger"
kontonummer_iban NVARCHAR(50), -- "Kontonummer/IBAN"
bic NVARCHAR(20), -- "BIC"
verwendungszweck NVARCHAR(1000), -- "Verwendungszweck"
-- Processed/computed fields
parsed_date DATE, -- Parsed buchungstag
numeric_amount DECIMAL(15,2), -- Processed amount
-- Import metadata
import_date DATETIME2 NOT NULL DEFAULT GETDATE(),
import_batch_id NVARCHAR(100), -- To group imports from same file
source_filename NVARCHAR(255), -- Original CSV filename
source_row_number INT, -- Row number in original CSV
-- Processing status
is_processed BIT NOT NULL DEFAULT 0, -- Whether this transaction has been processed
processing_notes NVARCHAR(500), -- Any processing notes or errors
-- Create indexes for performance
INDEX IX_CSVTransactions_IBAN (kontonummer_iban),
INDEX IX_CSVTransactions_Date (parsed_date),
INDEX IX_CSVTransactions_Amount (numeric_amount),
INDEX IX_CSVTransactions_ImportBatch (import_batch_id),
INDEX IX_CSVTransactions_Processed (is_processed)
);
-- Update BankingAccountTransactions to reference CSVTransactions
-- Add a new column to support both AccountingItems and CSVTransactions
ALTER TABLE fibdash.BankingAccountTransactions
ADD csv_transaction_id INT NULL;
-- Add foreign key constraint
ALTER TABLE fibdash.BankingAccountTransactions
ADD CONSTRAINT FK_BankingAccountTransactions_CSVTransactions
FOREIGN KEY (csv_transaction_id) REFERENCES fibdash.CSVTransactions(id);
-- Create index for the new column
CREATE INDEX IX_BankingAccountTransactions_CSVTransactionId
ON fibdash.BankingAccountTransactions(csv_transaction_id);
-- Update the view to include CSV transactions
DROP VIEW IF EXISTS fibdash.vw_TransactionsWithKreditors;
GO
CREATE VIEW fibdash.vw_TransactionsWithKreditors AS
-- AccountingItems transactions
SELECT
'AccountingItems' as source_table,
ai.id as transaction_id,
NULL as csv_transaction_id,
ai.umsatz_brutto as amount,
ai.buchungsdatum as transaction_date,
NULL as kontonummer_iban, -- AccountingItems uses gegenkonto
ai.buchungstext as description,
k.name as kreditor_name,
k.kreditorId as kreditor_id,
k.is_banking as kreditor_is_banking,
bat.assigned_kreditor_id,
ak.name as assigned_kreditor_name,
ak.kreditorId as assigned_kreditor_id_code,
bat.assigned_date,
bat.notes as assignment_notes,
CASE
WHEN k.is_banking = 1 AND bat.assigned_kreditor_id IS NOT NULL THEN 'banking_assigned'
WHEN k.is_banking = 1 AND bat.assigned_kreditor_id IS NULL THEN 'banking_unassigned'
WHEN k.is_banking = 0 THEN 'regular_kreditor'
ELSE 'no_kreditor'
END as transaction_type
FROM fibdash.AccountingItems ai
LEFT JOIN fibdash.Kreditor k ON ai.gegenkonto = k.kreditorId
LEFT JOIN fibdash.BankingAccountTransactions bat ON ai.id = bat.transaction_id
LEFT JOIN fibdash.Kreditor ak ON bat.assigned_kreditor_id = ak.id
UNION ALL
-- CSV transactions
SELECT
'CSVTransactions' as source_table,
NULL as transaction_id,
csv.id as csv_transaction_id,
csv.numeric_amount as amount,
csv.parsed_date as transaction_date,
csv.kontonummer_iban,
csv.verwendungszweck as description,
k.name as kreditor_name,
k.kreditorId as kreditor_id,
k.is_banking as kreditor_is_banking,
bat.assigned_kreditor_id,
ak.name as assigned_kreditor_name,
ak.kreditorId as assigned_kreditor_id_code,
bat.assigned_date,
bat.notes as assignment_notes,
CASE
WHEN k.is_banking = 1 AND bat.assigned_kreditor_id IS NOT NULL THEN 'banking_assigned'
WHEN k.is_banking = 1 AND bat.assigned_kreditor_id IS NULL THEN 'banking_unassigned'
WHEN k.is_banking = 0 THEN 'regular_kreditor'
ELSE 'no_kreditor'
END as transaction_type
FROM fibdash.CSVTransactions csv
LEFT JOIN fibdash.Kreditor k ON csv.kontonummer_iban = k.iban
LEFT JOIN fibdash.BankingAccountTransactions bat ON csv.id = bat.csv_transaction_id
LEFT JOIN fibdash.Kreditor ak ON bat.assigned_kreditor_id = ak.id;
GO
PRINT 'CSV Transactions schema created successfully!';
PRINT 'Created CSVTransactions table';
PRINT 'Updated BankingAccountTransactions table';
PRINT 'Updated vw_TransactionsWithKreditors view';
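The `CASE` expression repeated in both branches of the view classifies every row into one of four `transaction_type` values. The same decision table, expressed as a sketch in JavaScript (`classifyTransaction` is an illustrative helper, not part of this commit):

```javascript
// Mirrors the CASE expression in vw_TransactionsWithKreditors.
// A missing kreditor row (NULL is_banking in SQL) falls through to
// 'no_kreditor', just like the SQL ELSE branch.
function classifyTransaction(kreditor, assignedKreditorId) {
  if (!kreditor) return 'no_kreditor';
  if (kreditor.is_banking) {
    return assignedKreditorId != null ? 'banking_assigned' : 'banking_unassigned';
  }
  return 'regular_kreditor';
}

console.log(classifyTransaction({ is_banking: true }, null)); // banking_unassigned
```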

View File

@@ -9,17 +9,20 @@ GO
-- Create Kreditor table
-- Multiple IBANs can have the same kreditor name and kreditorId
-- IBAN can be NULL, e.g. for Kreditors that only serve as targets of banking account assignments
-- is_banking flag indicates if this IBAN represents a banking account (like PayPal) rather than a direct creditor
CREATE TABLE fibdash.Kreditor (
id INT IDENTITY(1,1) PRIMARY KEY,
iban NVARCHAR(34) NOT NULL,
iban NVARCHAR(34) NULL, -- Nullable to allow Kreditors without IBAN
name NVARCHAR(255) NOT NULL,
kreditorId NVARCHAR(50) NOT NULL
kreditorId NVARCHAR(50) NOT NULL,
is_banking BIT NOT NULL DEFAULT 0 -- 1 = banking account, 0 = regular creditor
);
-- Create unique index on IBAN to prevent duplicate IBANs
-- but allow same kreditorId and name for multiple IBANs
ALTER TABLE fibdash.Kreditor
ADD CONSTRAINT UQ_Kreditor_IBAN UNIQUE (iban);
-- Create unique index on IBAN to prevent duplicate IBANs (allows NULL values)
CREATE UNIQUE INDEX UQ_Kreditor_IBAN_NotNull
ON fibdash.Kreditor(iban)
WHERE iban IS NOT NULL;
-- Create AccountingItems table
-- Based on CSV structure: umsatz brutto, soll/haben kz, konto, gegenkonto, bu, buchungsdatum, rechnungsnummer, buchungstext, beleglink
@@ -86,10 +89,14 @@ CSV
-- Create indexes for better performance
CREATE INDEX IX_Kreditor_IBAN ON fibdash.Kreditor(iban);
CREATE INDEX IX_Kreditor_KreditorId ON fibdash.Kreditor(kreditorId);
CREATE INDEX IX_Kreditor_IsBanking ON fibdash.Kreditor(is_banking);
CREATE INDEX IX_AccountingItems_Buchungsdatum ON fibdash.AccountingItems(buchungsdatum);
CREATE INDEX IX_AccountingItems_Konto ON fibdash.AccountingItems(konto);
CREATE INDEX IX_AccountingItems_Rechnungsnummer ON fibdash.AccountingItems(rechnungsnummer);
CREATE INDEX IX_AccountingItems_SollHabenKz ON fibdash.AccountingItems(soll_haben_kz);
CREATE INDEX IX_BankingAccountTransactions_TransactionId ON fibdash.BankingAccountTransactions(transaction_id);
CREATE INDEX IX_BankingAccountTransactions_BankingIban ON fibdash.BankingAccountTransactions(banking_iban);
CREATE INDEX IX_BankingAccountTransactions_AssignedKreditorId ON fibdash.BankingAccountTransactions(assigned_kreditor_id);
-- Add FK from AccountingItems.bu -> BU(bu)
ALTER TABLE fibdash.AccountingItems
@@ -106,6 +113,25 @@ ALTER TABLE fibdash.AccountingItems
ADD CONSTRAINT FK_AccountingItems_Konto_Konto
FOREIGN KEY (konto) REFERENCES fibdash.Konto(konto);
-- Create BankingAccountTransactions table to map banking account transactions to Kreditors
-- This table handles cases where an IBAN is a banking account (like PayPal) and needs
-- to be mapped to the actual creditor for accounting purposes
CREATE TABLE fibdash.BankingAccountTransactions (
id INT IDENTITY(1,1) PRIMARY KEY,
transaction_id INT NOT NULL, -- References AccountingItems.id
banking_iban NVARCHAR(34) NOT NULL, -- The banking account IBAN (e.g., PayPal)
assigned_kreditor_id INT NOT NULL, -- References Kreditor.id for the actual creditor
assigned_date DATETIME2 NOT NULL DEFAULT GETDATE(),
assigned_by NVARCHAR(100), -- User who made the assignment
notes NVARCHAR(500), -- Optional notes about the assignment
-- Foreign key constraints
CONSTRAINT FK_BankingAccountTransactions_AccountingItems
FOREIGN KEY (transaction_id) REFERENCES fibdash.AccountingItems(id),
CONSTRAINT FK_BankingAccountTransactions_Kreditor
FOREIGN KEY (assigned_kreditor_id) REFERENCES fibdash.Kreditor(id)
);
-- Add vst column to existing BU table (for databases created before this update)
-- IF NOT EXISTS (SELECT * FROM sys.columns WHERE object_id = OBJECT_ID('fibdash.BU') AND name = 'vst')
-- BEGIN
@@ -124,4 +150,28 @@ FOREIGN KEY (konto) REFERENCES fibdash.Konto(konto);
-- ('9', '19% VST', 19.00),
-- ('8', '7% VST', 7.00),
-- ('506', 'Dienstleistung aus EU', NULL),
-- ('511', 'Dienstleistung außerhalb EU', NULL);
-- ('511', 'Dienstleistung außerhalb EU', NULL);
-- Create view to easily query transactions with their assigned Kreditors
-- This view combines regular transactions with banking account assignments
CREATE VIEW fibdash.vw_TransactionsWithKreditors AS
SELECT
ai.*,
k.name as kreditor_name,
k.kreditorId as kreditor_id,
k.is_banking as kreditor_is_banking,
bat.assigned_kreditor_id,
ak.name as assigned_kreditor_name,
ak.kreditorId as assigned_kreditor_id_code,
bat.assigned_date,
bat.notes as assignment_notes,
CASE
WHEN k.is_banking = 1 AND bat.assigned_kreditor_id IS NOT NULL THEN 'banking_assigned'
WHEN k.is_banking = 1 AND bat.assigned_kreditor_id IS NULL THEN 'banking_unassigned'
WHEN k.is_banking = 0 THEN 'regular_kreditor'
ELSE 'no_kreditor'
END as transaction_type
FROM fibdash.AccountingItems ai
LEFT JOIN fibdash.Kreditor k ON ai.gegenkonto = k.kreditorId
LEFT JOIN fibdash.BankingAccountTransactions bat ON ai.id = bat.transaction_id
LEFT JOIN fibdash.Kreditor ak ON bat.assigned_kreditor_id = ak.id;

View File

@@ -22,7 +22,7 @@ router.get('/system-info', authenticateToken, (req, res) => {
// Get all kreditoren
router.get('/kreditoren', authenticateToken, async (req, res) => {
try {
const result = await executeQuery('SELECT id, iban, name, kreditorId FROM fibdash.Kreditor ORDER BY name, iban');
const result = await executeQuery('SELECT id, iban, name, kreditorId, is_banking FROM fibdash.Kreditor ORDER BY name, iban');
res.json({ kreditoren: result.recordset });
} catch (error) {
console.error('Error fetching kreditoren:', error);
@@ -32,22 +32,30 @@ router.get('/kreditoren', authenticateToken, async (req, res) => {
// Create new kreditor
router.post('/kreditoren', authenticateToken, async (req, res) => {
const { iban, name, kreditorId } = req.body;
const { iban, name, kreditorId, is_banking } = req.body;
if (!iban || !name || !kreditorId) {
return res.status(400).json({ error: 'IBAN, Name und Kreditor ID sind erforderlich' });
// IBAN is optional for banking accounts or manual kreditor assignments
const isBanking = is_banking || false;
if (!name || !kreditorId) {
return res.status(400).json({ error: 'Name und Kreditor ID sind erforderlich' });
}
// IBAN validation - required for non-banking accounts
if (!isBanking && (!iban || iban.trim() === '')) {
return res.status(400).json({ error: 'IBAN ist erforderlich (außer für Banking-Konten)' });
}
try {
await executeQuery(
'INSERT INTO fibdash.Kreditor (iban, name, kreditorId) VALUES (@iban, @name, @kreditorId)',
{ iban, name, kreditorId }
'INSERT INTO fibdash.Kreditor (iban, name, kreditorId, is_banking) VALUES (@iban, @name, @kreditorId, @is_banking)',
{ iban: iban || null, name, kreditorId, is_banking: isBanking }
);
res.json({ message: 'Kreditor erfolgreich erstellt' });
} catch (error) {
console.error('Error creating kreditor:', error);
if (error.number === 2627) { // Unique constraint violation
res.status(400).json({ error: 'Kreditor ID bereits vorhanden' });
res.status(400).json({ error: 'IBAN oder Kreditor ID bereits vorhanden' });
} else {
res.status(500).json({ error: 'Fehler beim Erstellen des Kreditors' });
}
@@ -57,22 +65,30 @@ router.post('/kreditoren', authenticateToken, async (req, res) => {
// Update kreditor
router.put('/kreditoren/:id', authenticateToken, async (req, res) => {
const { id } = req.params;
const { iban, name, kreditorId } = req.body;
const { iban, name, kreditorId, is_banking } = req.body;
if (!iban || !name || !kreditorId) {
return res.status(400).json({ error: 'IBAN, Name und Kreditor ID sind erforderlich' });
// IBAN is optional for banking accounts or manual kreditor assignments
const isBanking = is_banking || false;
if (!name || !kreditorId) {
return res.status(400).json({ error: 'Name und Kreditor ID sind erforderlich' });
}
// IBAN validation - required for non-banking accounts
if (!isBanking && (!iban || iban.trim() === '')) {
return res.status(400).json({ error: 'IBAN ist erforderlich (außer für Banking-Konten)' });
}
try {
await executeQuery(
'UPDATE fibdash.Kreditor SET iban = @iban, name = @name, kreditorId = @kreditorId WHERE id = @id',
{ iban, name, kreditorId, id }
'UPDATE fibdash.Kreditor SET iban = @iban, name = @name, kreditorId = @kreditorId, is_banking = @is_banking WHERE id = @id',
{ iban: iban || null, name, kreditorId, is_banking: isBanking, id }
);
res.json({ message: 'Kreditor erfolgreich aktualisiert' });
} catch (error) {
console.error('Error updating kreditor:', error);
if (error.number === 2627) { // Unique constraint violation
res.status(400).json({ error: 'Kreditor ID bereits vorhanden' });
res.status(400).json({ error: 'IBAN oder Kreditor ID bereits vorhanden' });
} else {
res.status(500).json({ error: 'Fehler beim Aktualisieren des Kreditors' });
}

View File

@@ -239,7 +239,7 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
let kreditorData = [];
try {
const { executeQuery } = require('../config/database');
const kreditorQuery = `SELECT id, iban, name, kreditorId FROM fibdash.Kreditor`;
const kreditorQuery = `SELECT id, iban, name, kreditorId, is_banking FROM fibdash.Kreditor`;
const kreditorResult = await executeQuery(kreditorQuery);
kreditorData = kreditorResult.recordset || [];
} catch (error) {
@@ -284,7 +284,8 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
id: kreditorMatch.id,
name: kreditorMatch.name,
kreditorId: kreditorMatch.kreditorId,
iban: kreditorMatch.iban
iban: kreditorMatch.iban,
is_banking: Boolean(kreditorMatch.is_banking)
} : null,
hasKreditor: !!kreditorMatch
};
@@ -612,7 +613,7 @@ router.get('/kreditors/:id', authenticateToken, async (req, res) => {
const { id } = req.params;
const query = `
SELECT id, iban, name, kreditorId
SELECT id, iban, name, kreditorId, is_banking
FROM fibdash.Kreditor
WHERE id = @id
`;
@@ -634,32 +635,47 @@ router.get('/kreditors/:id', authenticateToken, async (req, res) => {
router.post('/kreditors', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { iban, name, kreditorId } = req.body;
const { iban, name, kreditorId, is_banking } = req.body;
// IBAN is optional for banking accounts or manual kreditor assignments
const isBanking = is_banking || false;
// Validate required fields
if (!iban || !name || !kreditorId) {
return res.status(400).json({ error: 'IBAN, name, and kreditorId are required' });
if (!name || !kreditorId) {
return res.status(400).json({ error: 'Name and kreditorId are required' });
}
// Check if IBAN already exists (only IBAN needs to be unique)
const checkQuery = `
SELECT id FROM fibdash.Kreditor
WHERE iban = @iban
`;
// IBAN validation - required for non-banking accounts
if (!isBanking && (!iban || iban.trim() === '')) {
return res.status(400).json({ error: 'IBAN is required (except for banking accounts)' });
}
const checkResult = await executeQuery(checkQuery, { iban });
if (checkResult.recordset.length > 0) {
return res.status(409).json({ error: 'Kreditor with this IBAN already exists' });
// Check if IBAN already exists (only if IBAN is provided)
if (iban && iban.trim() !== '') {
const checkQuery = `
SELECT id FROM fibdash.Kreditor
WHERE iban = @iban
`;
const checkResult = await executeQuery(checkQuery, { iban });
if (checkResult.recordset.length > 0) {
return res.status(409).json({ error: 'Kreditor with this IBAN already exists' });
}
}
const insertQuery = `
INSERT INTO fibdash.Kreditor (iban, name, kreditorId)
OUTPUT INSERTED.id, INSERTED.iban, INSERTED.name, INSERTED.kreditorId
VALUES (@iban, @name, @kreditorId)
INSERT INTO fibdash.Kreditor (iban, name, kreditorId, is_banking)
OUTPUT INSERTED.id, INSERTED.iban, INSERTED.name, INSERTED.kreditorId, INSERTED.is_banking
VALUES (@iban, @name, @kreditorId, @is_banking)
`;
const result = await executeQuery(insertQuery, { iban, name, kreditorId });
const result = await executeQuery(insertQuery, {
iban: iban || null,
name,
kreditorId,
is_banking: isBanking
});
res.status(201).json(result.recordset[0]);
} catch (error) {
@@ -673,11 +689,19 @@ router.put('/kreditors/:id', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { id } = req.params;
const { iban, name, kreditorId } = req.body;
const { iban, name, kreditorId, is_banking } = req.body;
// IBAN is optional for banking accounts or manual kreditor assignments
const isBanking = is_banking || false;
// Validate required fields
if (!iban || !name || !kreditorId) {
return res.status(400).json({ error: 'IBAN, name, and kreditorId are required' });
if (!name || !kreditorId) {
return res.status(400).json({ error: 'Name and kreditorId are required' });
}
// IBAN validation - required for non-banking accounts
if (!isBanking && (!iban || iban.trim() === '')) {
return res.status(400).json({ error: 'IBAN is required (except for banking accounts)' });
}
// Check if kreditor exists
@@ -688,26 +712,34 @@ router.put('/kreditors/:id', authenticateToken, async (req, res) => {
return res.status(404).json({ error: 'Kreditor not found' });
}
// Check for conflicts with other kreditors (only IBAN needs to be unique)
const conflictQuery = `
SELECT id FROM fibdash.Kreditor
WHERE iban = @iban AND id != @id
`;
const conflictResult = await executeQuery(conflictQuery, { iban, id: parseInt(id) });
if (conflictResult.recordset.length > 0) {
return res.status(409).json({ error: 'Another kreditor with this IBAN already exists' });
// Check for conflicts with other kreditors (only if IBAN is provided)
if (iban && iban.trim() !== '') {
const conflictQuery = `
SELECT id FROM fibdash.Kreditor
WHERE iban = @iban AND id != @id
`;
const conflictResult = await executeQuery(conflictQuery, { iban, id: parseInt(id) });
if (conflictResult.recordset.length > 0) {
return res.status(409).json({ error: 'Another kreditor with this IBAN already exists' });
}
}
const updateQuery = `
UPDATE fibdash.Kreditor
SET iban = @iban, name = @name, kreditorId = @kreditorId
OUTPUT INSERTED.id, INSERTED.iban, INSERTED.name, INSERTED.kreditorId
SET iban = @iban, name = @name, kreditorId = @kreditorId, is_banking = @is_banking
OUTPUT INSERTED.id, INSERTED.iban, INSERTED.name, INSERTED.kreditorId, INSERTED.is_banking
WHERE id = @id
`;
const result = await executeQuery(updateQuery, { iban, name, kreditorId, id: parseInt(id) });
const result = await executeQuery(updateQuery, {
iban: iban || null,
name,
kreditorId,
is_banking: isBanking,
id: parseInt(id)
});
res.json(result.recordset[0]);
} catch (error) {
@@ -740,4 +772,545 @@ router.delete('/kreditors/:id', authenticateToken, async (req, res) => {
}
});
// Banking Account Transactions endpoints
// Get banking account transactions for a specific transaction
router.get('/banking-transactions/:transactionId', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { transactionId } = req.params;
const query = `
SELECT
bat.*,
k.name as assigned_kreditor_name,
k.kreditorId as assigned_kreditor_id_code
FROM fibdash.BankingAccountTransactions bat
LEFT JOIN fibdash.Kreditor k ON bat.assigned_kreditor_id = k.id
WHERE bat.transaction_id = @transactionId OR bat.csv_transaction_id = @transactionId
`;
const result = await executeQuery(query, { transactionId: parseInt(transactionId) });
res.json(result.recordset);
} catch (error) {
console.error('Error fetching banking account transactions:', error);
res.status(500).json({ error: 'Failed to fetch banking account transactions' });
}
});
// Create banking account transaction assignment
router.post('/banking-transactions', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { transaction_id, csv_transaction_id, banking_iban, assigned_kreditor_id, notes, assigned_by } = req.body;
// Validate required fields - need either transaction_id or csv_transaction_id
if ((!transaction_id && !csv_transaction_id) || !banking_iban || !assigned_kreditor_id) {
return res.status(400).json({
error: 'Transaction ID (or CSV Transaction ID), banking IBAN, and assigned kreditor ID are required'
});
}
// Check if assignment already exists
const checkQuery = `
SELECT id FROM fibdash.BankingAccountTransactions
WHERE transaction_id = @transaction_id OR csv_transaction_id = @csv_transaction_id
`;
const checkResult = await executeQuery(checkQuery, {
transaction_id: transaction_id || null,
csv_transaction_id: csv_transaction_id || null
});
if (checkResult.recordset.length > 0) {
return res.status(409).json({ error: 'Banking transaction assignment already exists' });
}
const insertQuery = `
INSERT INTO fibdash.BankingAccountTransactions
(transaction_id, csv_transaction_id, banking_iban, assigned_kreditor_id, notes, assigned_by)
OUTPUT INSERTED.*
VALUES (@transaction_id, @csv_transaction_id, @banking_iban, @assigned_kreditor_id, @notes, @assigned_by)
`;
const result = await executeQuery(insertQuery, {
transaction_id: transaction_id || null,
csv_transaction_id: csv_transaction_id || null,
banking_iban,
assigned_kreditor_id,
notes: notes || null,
assigned_by: assigned_by || null
});
res.status(201).json(result.recordset[0]);
} catch (error) {
console.error('Error creating banking account transaction:', error);
res.status(500).json({ error: 'Failed to create banking account transaction' });
}
});
// Update banking account transaction assignment
router.put('/banking-transactions/:id', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { id } = req.params;
const { assigned_kreditor_id, notes, assigned_by } = req.body;
// Validate required fields
if (!assigned_kreditor_id) {
return res.status(400).json({ error: 'Assigned kreditor ID is required' });
}
const updateQuery = `
UPDATE fibdash.BankingAccountTransactions
SET assigned_kreditor_id = @assigned_kreditor_id,
notes = @notes,
assigned_by = @assigned_by,
assigned_date = GETDATE()
OUTPUT INSERTED.*
WHERE id = @id
`;
const result = await executeQuery(updateQuery, {
assigned_kreditor_id,
notes: notes || null,
assigned_by: assigned_by || null,
id: parseInt(id)
});
if (result.recordset.length === 0) {
return res.status(404).json({ error: 'Banking transaction assignment not found' });
}
res.json(result.recordset[0]);
} catch (error) {
console.error('Error updating banking account transaction:', error);
res.status(500).json({ error: 'Failed to update banking account transaction' });
}
});
// Delete banking account transaction assignment
router.delete('/banking-transactions/:id', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { id } = req.params;
const deleteQuery = `
DELETE FROM fibdash.BankingAccountTransactions
WHERE id = @id
`;
const result = await executeQuery(deleteQuery, { id: parseInt(id) });
// Report 404 instead of claiming success when no row matched the id
if (!result.rowsAffected || result.rowsAffected[0] === 0) {
return res.status(404).json({ error: 'Banking transaction assignment not found' });
}
res.json({ message: 'Banking transaction assignment deleted successfully' });
} catch (error) {
console.error('Error deleting banking account transaction:', error);
res.status(500).json({ error: 'Failed to delete banking account transaction' });
}
});
// Get all kreditors that can be assigned to banking transactions (non-banking kreditors)
router.get('/assignable-kreditors', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const query = `
SELECT id, name, kreditorId
FROM fibdash.Kreditor
WHERE is_banking = 0
ORDER BY name
`;
const result = await executeQuery(query);
res.json(result.recordset);
} catch (error) {
console.error('Error fetching assignable kreditors:', error);
res.status(500).json({ error: 'Failed to fetch assignable kreditors' });
}
});
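// The two CSV import routes below duplicate their date and amount parsing inline.
// A sketch of pulling that logic into shared helpers (hypothetical names, shown for
// illustration only - the routes below still parse inline):
function parseGermanDate(dateStr) {
  // Accepts DD.MM.YY / DD.MM.YYYY (also "/" or "-" separators); returns a Date or null
  const parts = String(dateStr).trim().split(/[.\/\-]/);
  if (parts.length !== 3) return null;
  const day = parseInt(parts[0], 10);
  const month = parseInt(parts[1], 10) - 1; // JavaScript months are 0-based
  let year = parseInt(parts[2], 10);
  if (year < 100) year += (year < 50) ? 2000 : 1900;
  const date = new Date(year, month, day);
  // Round-trip check rejects impossible dates such as 31.02.2025
  return (date.getDate() === day && date.getMonth() === month && date.getFullYear() === year)
    ? date
    : null;
}

function parseGermanAmount(value) {
  // "1.234,56" -> 1234.56; dot-decimal input is passed through unchanged
  const cleaned = String(value).replace(/[^\d,.-]/g, '');
  const normalized = cleaned.includes(',')
    ? cleaned.replace(/\./g, '').replace(',', '.')
    : cleaned;
  return parseFloat(normalized) || 0;
}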
// CSV Import endpoints
// Test CSV import endpoint - WARNING: unauthenticated and WRITES REAL ROWS to the
// database; intended for local testing only and should be removed before production
router.post('/test-csv-import', async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { transactions, filename, batchId, headers } = req.body;
if (!transactions || !Array.isArray(transactions)) {
return res.status(400).json({ error: 'Transactions array is required' });
}
const importBatchId = batchId || `test_import_${Date.now()}`;
let successCount = 0;
let errorCount = 0;
const errors = [];
for (let i = 0; i < transactions.length; i++) {
const transaction = transactions[i];
try {
// Validate required fields
const validationErrors = [];
if (!transaction['Buchungstag'] || transaction['Buchungstag'].trim() === '') {
validationErrors.push('Buchungstag is required');
}
if (!transaction['Betrag'] || transaction['Betrag'].toString().trim() === '') {
validationErrors.push('Betrag is required');
}
if (validationErrors.length > 0) {
errors.push({
row: i + 1,
error: `Validation failed: ${validationErrors.join(', ')}`,
transaction: transaction
});
errorCount++;
continue;
}
// Parse the date (DD.MM.YY or DD.MM.YYYY; "/" and "-" separators are also accepted)
let parsedDate = null;
if (transaction['Buchungstag']) {
const dateStr = transaction['Buchungstag'].trim();
const dateParts = dateStr.split(/[.\/\-]/);
if (dateParts.length === 3) {
const day = parseInt(dateParts[0], 10);
const month = parseInt(dateParts[1], 10) - 1; // JavaScript months are 0-based
let year = parseInt(dateParts[2], 10);
if (year < 100) {
year += (year < 50) ? 2000 : 1900; // 00-49 => 2000-2049, 50-99 => 1950-1999
}
parsedDate = new Date(year, month, day);
if (isNaN(parsedDate.getTime())) {
parsedDate = null;
validationErrors.push(`Invalid date format: ${dateStr}`);
}
}
}
// Parse the amount (German format: "." as thousands separator, "," as decimal separator)
let numericAmount = 0;
if (transaction['Betrag']) {
const amountStr = transaction['Betrag'].toString().replace(/[^\d,.-]/g, '');
// "1.234,56" -> "1234.56"; values already using a dot decimal are passed through
const normalizedAmount = amountStr.includes(',')
? amountStr.replace(/\./g, '').replace(',', '.')
: amountStr;
numericAmount = parseFloat(normalizedAmount) || 0;
}
// Abort this row if date parsing added validation errors after the initial gate
if (validationErrors.length > 0) {
errors.push({
row: i + 1,
error: `Validation failed: ${validationErrors.join(', ')}`,
transaction: transaction
});
errorCount++;
continue;
}
const insertQuery = `
INSERT INTO fibdash.CSVTransactions
(buchungstag, wertstellung, umsatzart, betrag, betrag_original, waehrung,
beguenstigter_zahlungspflichtiger, kontonummer_iban, bic, verwendungszweck,
parsed_date, numeric_amount, import_batch_id, source_filename, source_row_number)
VALUES
(@buchungstag, @wertstellung, @umsatzart, @betrag, @betrag_original, @waehrung,
@beguenstigter_zahlungspflichtiger, @kontonummer_iban, @bic, @verwendungszweck,
@parsed_date, @numeric_amount, @import_batch_id, @source_filename, @source_row_number)
`;
await executeQuery(insertQuery, {
buchungstag: transaction['Buchungstag'] || null,
wertstellung: transaction['Valutadatum'] || null,
umsatzart: transaction['Buchungstext'] || null,
betrag: numericAmount,
betrag_original: transaction['Betrag'] || null,
waehrung: transaction['Waehrung'] || null,
beguenstigter_zahlungspflichtiger: transaction['Beguenstigter/Zahlungspflichtiger'] || null,
kontonummer_iban: transaction['Kontonummer/IBAN'] || null,
bic: transaction['BIC (SWIFT-Code)'] || null,
verwendungszweck: transaction['Verwendungszweck'] || null,
parsed_date: parsedDate,
numeric_amount: numericAmount,
import_batch_id: importBatchId,
source_filename: filename || 'test_import',
source_row_number: i + 1
});
successCount++;
} catch (error) {
console.error(`Error importing transaction ${i + 1}:`, error);
errors.push({
row: i + 1,
error: error.message,
transaction: transaction
});
errorCount++;
}
}
res.json({
success: true,
batchId: importBatchId,
imported: successCount,
errors: errorCount,
details: errors.length > 0 ? errors : undefined,
// Debug aid: surface the row for this known PayPal clearing IBAN, if present in the batch
paypalTransaction: transactions.find(t => t['Kontonummer/IBAN'] === 'LU89751000135104200E')
});
} catch (error) {
console.error('Test import error:', error);
res.status(500).json({ error: 'Test import failed' });
}
});
// Import CSV transactions to database
router.post('/import-csv-transactions', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { transactions, filename, batchId, headers } = req.body;
if (!transactions || !Array.isArray(transactions)) {
return res.status(400).json({ error: 'Transactions array is required' });
}
// Expected CSV headers (German bank format)
const expectedHeaders = [
'Auftragskonto',
'Buchungstag',
'Valutadatum',
'Buchungstext',
'Verwendungszweck',
'Glaeubiger ID',
'Mandatsreferenz',
'Kundenreferenz (End-to-End)',
'Sammlerreferenz',
'Lastschrift Ursprungsbetrag',
'Auslagenersatz Ruecklastschrift',
'Beguenstigter/Zahlungspflichtiger',
'Kontonummer/IBAN',
'BIC (SWIFT-Code)',
'Betrag',
'Waehrung',
'Info'
];
// Validate headers if provided
if (headers && Array.isArray(headers)) {
const missingHeaders = expectedHeaders.filter(expected =>
!headers.some(header => header.trim() === expected)
);
if (missingHeaders.length > 0) {
return res.status(400).json({
error: 'Invalid CSV format - missing required headers',
missing: missingHeaders,
expected: expectedHeaders,
received: headers
});
}
}
// Validate that we have transactions
if (transactions.length === 0) {
return res.status(400).json({ error: 'No transaction data found' });
}
const importBatchId = batchId || `import_${Date.now()}`;
let successCount = 0;
let errorCount = 0;
const errors = [];
for (let i = 0; i < transactions.length; i++) {
const transaction = transactions[i];
try {
// Validate required fields for each transaction
const validationErrors = [];
if (!transaction['Buchungstag'] || transaction['Buchungstag'].trim() === '') {
validationErrors.push('Buchungstag is required');
}
if (!transaction['Betrag'] || transaction['Betrag'].toString().trim() === '') {
validationErrors.push('Betrag is required');
}
if (!transaction['Beguenstigter/Zahlungspflichtiger'] || transaction['Beguenstigter/Zahlungspflichtiger'].trim() === '') {
validationErrors.push('Beguenstigter/Zahlungspflichtiger is required');
}
// Skip rows where every required field is missing (most likely a stray header or empty row)
if (validationErrors.length > 2) {
console.log(`Skipping invalid row ${i + 1}:`, validationErrors);
continue;
}
if (validationErrors.length > 0) {
errors.push({
row: i + 1,
error: `Validation failed: ${validationErrors.join(', ')}`,
transaction: transaction
});
errorCount++;
continue;
}
// Parse the date
let parsedDate = null;
if (transaction['Buchungstag']) {
const dateStr = transaction['Buchungstag'].trim();
// Try different date formats (DD.MM.YY, DD.MM.YYYY, DD/MM/YYYY, etc.)
const dateParts = dateStr.split(/[.\/\-]/);
if (dateParts.length === 3) {
const day = parseInt(dateParts[0]);
const month = parseInt(dateParts[1]) - 1; // JavaScript months are 0-based
let year = parseInt(dateParts[2]);
// Handle 2-digit years (assume 21.07.25 means 2025)
if (year < 100) {
year += (year < 50) ? 2000 : 1900; // 00-49 = 2000-2049, 50-99 = 1950-1999
}
parsedDate = new Date(year, month, day);
// Validate the date
if (isNaN(parsedDate.getTime()) ||
parsedDate.getDate() !== day ||
parsedDate.getMonth() !== month ||
parsedDate.getFullYear() !== year) {
parsedDate = null;
validationErrors.push(`Invalid date format: ${dateStr}`);
}
} else {
validationErrors.push(`Invalid date format: ${dateStr}`);
}
}
// Parse the amount (German format: "." as thousands separator, "," as decimal separator)
let numericAmount = 0;
if (transaction['Betrag']) {
const amountStr = transaction['Betrag'].toString().replace(/[^\d,.-]/g, '');
// "1.234,56" -> "1234.56"; values already using a dot decimal are passed through
const normalizedAmount = amountStr.includes(',')
? amountStr.replace(/\./g, '').replace(',', '.')
: amountStr;
numericAmount = parseFloat(normalizedAmount) || 0;
}
// Abort this row if date parsing added validation errors after the initial gate
if (validationErrors.length > 0) {
errors.push({
row: i + 1,
error: `Validation failed: ${validationErrors.join(', ')}`,
transaction: transaction
});
errorCount++;
continue;
}
const insertQuery = `
INSERT INTO fibdash.CSVTransactions
(buchungstag, wertstellung, umsatzart, betrag, betrag_original, waehrung,
beguenstigter_zahlungspflichtiger, kontonummer_iban, bic, verwendungszweck,
parsed_date, numeric_amount, import_batch_id, source_filename, source_row_number)
VALUES
(@buchungstag, @wertstellung, @umsatzart, @betrag, @betrag_original, @waehrung,
@beguenstigter_zahlungspflichtiger, @kontonummer_iban, @bic, @verwendungszweck,
@parsed_date, @numeric_amount, @import_batch_id, @source_filename, @source_row_number)
`;
await executeQuery(insertQuery, {
buchungstag: transaction['Buchungstag'] || null,
wertstellung: transaction['Valutadatum'] || null,
umsatzart: transaction['Buchungstext'] || null,
betrag: numericAmount,
betrag_original: transaction['Betrag'] || null,
waehrung: transaction['Waehrung'] || null,
beguenstigter_zahlungspflichtiger: transaction['Beguenstigter/Zahlungspflichtiger'] || null,
kontonummer_iban: transaction['Kontonummer/IBAN'] || null,
bic: transaction['BIC (SWIFT-Code)'] || null,
verwendungszweck: transaction['Verwendungszweck'] || null,
parsed_date: parsedDate,
numeric_amount: numericAmount,
import_batch_id: importBatchId,
source_filename: filename || null,
source_row_number: i + 1
});
successCount++;
} catch (error) {
console.error(`Error importing transaction ${i + 1}:`, error);
errors.push({
row: i + 1,
error: error.message,
transaction: transaction
});
errorCount++;
}
}
res.json({
success: true,
batchId: importBatchId,
imported: successCount,
errors: errorCount,
details: errors.length > 0 ? errors : undefined
});
} catch (error) {
console.error('Error importing CSV transactions:', error);
res.status(500).json({ error: 'Failed to import CSV transactions' });
}
});
// Get imported CSV transactions
router.get('/csv-transactions', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const { batchId, limit = 100, offset = 0 } = req.query;
let query = `
SELECT
csv.*,
k.name as kreditor_name,
k.kreditorId as kreditor_id,
k.is_banking as kreditor_is_banking,
bat.assigned_kreditor_id,
ak.name as assigned_kreditor_name
FROM fibdash.CSVTransactions csv
LEFT JOIN fibdash.Kreditor k ON csv.kontonummer_iban = k.iban
LEFT JOIN fibdash.BankingAccountTransactions bat ON csv.id = bat.csv_transaction_id
LEFT JOIN fibdash.Kreditor ak ON bat.assigned_kreditor_id = ak.id
`;
const params = {};
if (batchId) {
query += ' WHERE csv.import_batch_id = @batchId';
params.batchId = batchId;
}
query += ' ORDER BY csv.parsed_date DESC, csv.id DESC';
query += ' OFFSET @offset ROWS FETCH NEXT @limit ROWS ONLY';
// Guard against non-numeric query params (parseInt would otherwise yield NaN)
params.offset = parseInt(offset, 10) || 0;
params.limit = parseInt(limit, 10) || 100;
const result = await executeQuery(query, params);
res.json(result.recordset);
} catch (error) {
console.error('Error fetching CSV transactions:', error);
res.status(500).json({ error: 'Failed to fetch CSV transactions' });
}
});
// Get CSV import batches
router.get('/csv-import-batches', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../config/database');
const query = `
SELECT
import_batch_id,
source_filename,
MIN(import_date) as import_date,
COUNT(*) as transaction_count,
SUM(CASE WHEN is_processed = 1 THEN 1 ELSE 0 END) as processed_count
FROM fibdash.CSVTransactions
GROUP BY import_batch_id, source_filename
ORDER BY MIN(import_date) DESC
`;
const result = await executeQuery(query);
res.json(result.recordset);
} catch (error) {
console.error('Error fetching import batches:', error);
res.status(500).json({ error: 'Failed to fetch import batches' });
}
});
module.exports = router;
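// Example client-side usage of the import route above (a sketch: assumes Node 18+
// global fetch, an `apiBase` like '/api', and a valid JWT - names are illustrative):
function buildImportPayload(transactions, filename, batchId) {
  return {
    transactions,                              // array of row objects keyed by CSV header
    filename,
    batchId: batchId || `import_${Date.now()}` // optional; server generates one if omitted
  };
}

async function importCsvTransactions(apiBase, token, transactions, filename) {
  const res = await fetch(`${apiBase}/import-csv-transactions`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${token}`
    },
    body: JSON.stringify(buildImportPayload(transactions, filename))
  });
  if (!res.ok) throw new Error(`Import failed with HTTP ${res.status}`);
  return res.json(); // { success, batchId, imported, errors, details? }
}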