Compare commits

...

11 Commits

Author SHA1 Message Date
sebseb7
adfcd90dcf Enhance transaction filtering by time range in API
- Implemented dynamic SQL WHERE clause to filter transactions based on various time range formats: quarter, year, and month.
- Removed redundant post-processing logic for filtering transactions, as the SQL query now handles this directly.
- Updated summary calculations to reflect the new transaction filtering approach, ensuring accurate reporting of totals and JTL matches.
2025-08-23 04:17:17 +02:00
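The filter described in this commit can be pictured as a small helper that turns a period string into a half-open date range plus a parameterized WHERE clause. The sketch below is illustrative only: the function, table, and column names (buildTimeRangeFilter, parsed_date) are assumptions, not code from this changeset.

```javascript
// Hypothetical sketch: map 'YYYY', 'YYYY-MM' or 'YYYY-Qn' to a half-open date range
// and a parameterized WHERE clause for the mssql driver.
function buildTimeRangeFilter(range) {
  const quarter = /^(\d{4})-Q([1-4])$/.exec(range);
  const month = /^(\d{4})-(\d{2})$/.exec(range);
  const year = /^(\d{4})$/.exec(range);
  let start, end;

  if (quarter) {
    const y = Number(quarter[1]), q = Number(quarter[2]);
    start = new Date(Date.UTC(y, (q - 1) * 3, 1));
    end = new Date(Date.UTC(y, q * 3, 1));
  } else if (month) {
    const y = Number(month[1]), m = Number(month[2]);
    start = new Date(Date.UTC(y, m - 1, 1));
    end = new Date(Date.UTC(y, m, 1));
  } else if (year) {
    const y = Number(year[1]);
    start = new Date(Date.UTC(y, 0, 1));
    end = new Date(Date.UTC(y + 1, 0, 1));
  } else {
    return { clause: '', params: {} }; // unknown format: apply no time filter
  }

  return {
    clause: 'WHERE parsed_date >= @rangeStart AND parsed_date < @rangeEnd',
    params: { rangeStart: start, rangeEnd: end },
  };
}
```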
sebseb7
bb610e0480 Add duplicate transaction check in CSV import process
- Implemented a check for existing transactions in the database to prevent duplicates during CSV imports.
- Added SQL query to count existing transactions based on key fields before insertion.
- Enhanced error handling to log and skip duplicate transactions, improving data integrity during the import process.
2025-08-22 23:46:20 +02:00
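A hedged sketch of the duplicate check this commit describes, built on the executeQuery helper from src/config/database.js. The table and column names are assumptions, and the sketch assumes executeQuery resolves to the recordset rows.

```javascript
const { executeQuery } = require('../config/database');

// Returns true when a row with the same key fields already exists (illustrative schema).
async function transactionExists(tx) {
  const rows = await executeQuery(
    `SELECT COUNT(*) AS cnt
       FROM csv_transactions
      WHERE buchungstag = @bookingDate
        AND betrag = @amount
        AND verwendungszweck = @purpose
        AND beguenstigter = @counterparty`,
    {
      bookingDate: tx.bookingDate,
      amount: tx.amount,
      purpose: tx.purpose,
      counterparty: tx.counterparty,
    }
  );
  return rows[0].cnt > 0;
}
```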
sebseb7
44d6cf6352 Update CSV import queries to include 'pending' status for datevlink field
- Modified SQL update queries in the csvImport.js file to allow for 'pending' status in addition to NULL or empty values for the datevlink field in both tUmsatzBeleg and tPdfObjekt tables. This change enhances the handling of datevlink updates during the import process.
2025-08-22 23:23:58 +02:00
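One way to read the change: the precondition of the UPDATE statements was widened so that rows whose datevlink is still 'pending' are also overwritten. The snippet below is a sketch only; the key column name is an assumed placeholder.

```javascript
// Sketch of the relaxed UPDATE precondition for tUmsatzBeleg (same idea for tPdfObjekt).
// kUmsatzBeleg is an assumed primary-key name, used here only for illustration.
const updateBelegLinkQuery = `
  UPDATE tUmsatzBeleg
     SET datevlink = @datevlink
   WHERE kUmsatzBeleg = @id
     AND (datevlink IS NULL OR datevlink = '' OR datevlink = 'pending')
`;
```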
sebseb7
74529d8b19 Enhanced error handling and logging for the DATEV export process. 2025-08-21 11:41:17 +02:00
sebseb7
bd7c6dddbf Enhance CSV import functionality with improved error messaging and logging
- Updated error message in CSVImportPanel to include a period for better readability.
- Added console logs in the CSV import API route to track the import process and precheck status.
- Removed redundant validation for 'Beguenstigter/Zahlungspflichtiger' to streamline error handling during CSV import.
2025-08-21 04:46:30 +02:00
sebseb7
8e8d93e4a6 Implement Google OAuth flow and enhance login functionality
- Updated the Google Sign-In integration to utilize the new OAuth callback mechanism.
- Added a redirect flow for Google authentication, improving user experience.
- Enhanced error handling and user feedback during the login process.
- Removed hardcoded Google client ID in favor of environment variable usage.
- Introduced a new component for handling OAuth callbacks and updated the App component to manage authentication states accordingly.
- Improved API route for processing OAuth callbacks, including token exchange and user verification.
2025-08-15 19:48:45 +02:00
sebseb7
fee9f02faa Enhance Accounting Items Management with JTL Kontierung Integration
- Added a new API route to fetch JTL Kontierung data based on transaction ID.
- Implemented loading of JTL Kontierung data in the AccountingItemsManager component.
- Updated UI to display JTL Kontierung data for debugging purposes.
- Enhanced user feedback during processing tasks in the App component with tooltips and progress indicators.
2025-08-08 11:32:57 +02:00
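The new route can be imagined roughly as below. The tUmsatzKontierung table name appears in the component's debug panel further down this diff; the key column, SELECT shape, and the assumption that executeQuery resolves to recordset rows are illustrative.

```javascript
// Hedged sketch of GET /api/data/jtl-kontierung/:jtlId (column names assumed).
router.get('/jtl-kontierung/:jtlId', async (req, res) => {
  try {
    const rows = await executeQuery(
      'SELECT * FROM tUmsatzKontierung WHERE kUmsatz = @jtlId',
      { jtlId: req.params.jtlId }
    );
    res.json(rows.length ? rows[0] : null);
  } catch (err) {
    console.error('Failed to load JTL Kontierung:', err);
    res.status(500).json({ error: 'Failed to load JTL Kontierung' });
  }
});
```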
sebseb7
bcd7eea1b4 Update API target and port configuration; upgrade OpenAI model version
- Changed API proxy target from localhost:5000 to localhost:5500 in both webpack configurations.
- Updated server port from 5000 to 5500 in src/index.js for consistency.
- Upgraded OpenAI model from "gpt-4o-mini" to "gpt-5-mini" in document processing routes, enhancing processing capabilities.
2025-08-08 10:31:33 +02:00
sebseb7
281754de22 Add OpenAI API integration and document processing features
- Added OpenAI API key configuration to .env.example.
- Integrated OpenAI for document processing, including markdown conversion and data extraction.
- Implemented new API routes for fetching document processing status and handling various processing tasks.
- Enhanced the App component to manage document status and processing states with user feedback via Snackbar.
- Updated CSVImportPanel and TableManagement components to support navigation to specific tabs based on processing results.
- Introduced transaction handling in the database configuration for improved error management during document processing.
2025-08-06 11:11:23 +02:00
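As a rough illustration of the extraction step mentioned here, the OpenAI Node SDK can be given the converted markdown and asked for structured JSON. The prompt and function name are assumptions; only the openai dependency and the gpt-4o-mini model name come from this changeset.

```javascript
const OpenAI = require('openai');

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Illustrative sketch: send the document markdown and parse the model's JSON reply.
async function extractDocumentData(markdown) {
  const completion = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [
      { role: 'system', content: 'Extract the invoice fields as a JSON object.' },
      { role: 'user', content: markdown },
    ],
  });
  return JSON.parse(completion.choices[0].message.content);
}
```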
sebseb7
d60da0a7aa Refactor CSVImportDialog to CSVImportPanel and enhance UI components
- Renamed CSVImportDialog component to CSVImportPanel for clarity.
- Replaced Dialog with Paper component for improved layout.
- Removed unused code and comments to streamline the component.
- Updated import result messages for better user feedback.
- Enhanced button styles and layout for a more user-friendly interface.
- Added new API route for importing DATEV Beleglinks to the database, including validation and error handling.
2025-08-05 10:17:54 +02:00
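The Beleglinks import route mentioned in the last bullet is not part of the visible diff; a minimal sketch of what the frontend's /data/import-datev-beleglinks call might hit is shown below. The table name, field names, and validation are assumptions.

```javascript
// Hypothetical sketch of POST /api/data/import-datev-beleglinks.
router.post('/import-datev-beleglinks', async (req, res) => {
  const { beleglinks, filename, batchId } = req.body;
  if (!Array.isArray(beleglinks) || beleglinks.length === 0) {
    return res.status(400).json({ error: 'Keine Beleglinks im Request gefunden' });
  }
  let imported = 0;
  let errors = 0;
  for (const row of beleglinks) {
    try {
      await executeQuery(
        `INSERT INTO datev_beleglinks (beleglink, filename, batch_id)
         VALUES (@link, @filename, @batchId)`,
        { link: row['Beleglink'], filename, batchId }
      );
      imported++;
    } catch (err) {
      console.error('Skipping Beleglink row:', err.message);
      errors++;
    }
  }
  res.json({ imported, errors });
});
```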
sebseb7
46c9e9b97d Add Accounting Items Management and SQL Integration
- Introduced AccountingItemsManager component for managing accounting entries within transactions.
- Implemented API routes for creating, retrieving, updating, and deleting accounting items.
- Added SQL queries to handle accounting items linked to transactions, supporting both numeric and string transaction IDs.
- Enhanced CSV import functionality to include new accounting item handling.
- Created mssql.md documentation for SQL command usage related to accounting items.
2025-08-05 09:25:32 +02:00
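The CRUD endpoints referenced here correspond to the calls the AccountingItemsManager component makes below (/data/accounting-items, /data/kontos, /data/bus). A hedged sketch of two of them, with an assumed accounting_items table and assuming executeQuery resolves to the recordset rows:

```javascript
// GET items for a transaction; the id may match the numeric transaction_id
// or the string csv_transaction_id (illustrative schema).
router.get('/accounting-items/:transactionId', async (req, res) => {
  const rows = await executeQuery(
    `SELECT * FROM accounting_items
      WHERE transaction_id = @id OR csv_transaction_id = @id`,
    { id: req.params.transactionId }
  );
  res.json(rows);
});

// DELETE a single accounting item.
router.delete('/accounting-items/:itemId', async (req, res) => {
  await executeQuery('DELETE FROM accounting_items WHERE id = @id', {
    id: req.params.itemId,
  });
  res.json({ success: true });
});
```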
26 changed files with 2927 additions and 524 deletions

View File

@@ -1,6 +1,4 @@
---
alwaysApply: true
---
sqlcmd -C -S tcp:192.168.56.1,1497 -U app -P 'readonly' -d eazybusiness -W
sqlcmd -C -S tcp:192.168.56.1,1497 -U sa -P 'sa_tekno23' -d eazybusiness -W

View File

@@ -0,0 +1,7 @@
---
alwaysApply: true
---
pm2 restart 10 -> restart backend (configured as "npm run dev:backend")
pm2 restart 11 -> restart frontend (configured as "npm run dev:frontend")
(both should rarely need to be restarted, because in dev mode HMR for the frontend and nodemon for the backend already handle reloads)

View File

@@ -8,6 +8,9 @@ REACT_APP_GOOGLE_CLIENT_ID=your_google_client_id_here
# JWT Secret
JWT_SECRET=your_jwt_secret_here
# OpenAI API Configuration
OPENAI_API_KEY=your_openai_api_key_here
# Authorized Email Addresses (comma-separated)
AUTHORIZED_EMAILS=admin@example.com,user1@example.com,user2@example.com

.kilocode/rules/mssql.md (new file, 4 lines)
View File

@@ -0,0 +1,4 @@
# mssql.md
sqlcmd -C -S tcp:192.168.56.1,1497 -U app -P 'readonly' -d eazybusiness -W

View File

@@ -1,14 +1,20 @@
import React, { Component } from 'react';
import { ThemeProvider, createTheme } from '@mui/material/styles';
import CssBaseline from '@mui/material/CssBaseline';
import { Container, AppBar, Toolbar, Typography, Button, Box, Tabs, Tab } from '@mui/material';
import { Container, AppBar, Toolbar, Typography, Button, Box, Tabs, Tab, Badge, Chip, Divider, Snackbar, Alert, LinearProgress, Tooltip, CircularProgress } from '@mui/material';
import LoginIcon from '@mui/icons-material/Login';
import DashboardIcon from '@mui/icons-material/Dashboard';
import DownloadIcon from '@mui/icons-material/Download';
import TableChart from '@mui/icons-material/TableChart';
import PlayArrowIcon from '@mui/icons-material/PlayArrow';
import DocumentScannerIcon from '@mui/icons-material/DocumentScanner';
import ExtractIcon from '@mui/icons-material/TextSnippet';
import EmailIcon from '@mui/icons-material/Email';
import UploadIcon from '@mui/icons-material/Upload';
import AuthService from './services/AuthService';
import DataViewer from './components/DataViewer';
import Login from './components/Login';
import OAuthCallback from './components/OAuthCallback';
const theme = createTheme({
palette: {
@@ -31,6 +37,18 @@ class App extends Component {
loading: true,
exportData: null, // { selectedMonth, canExport, onExport }
currentView: 'dashboard', // 'dashboard' or 'tables'
documentStatus: null,
processingStatus: {
markdown: false,
extraction: false,
datevSync: false,
datevUpload: false
},
snackbar: {
open: false,
message: '',
severity: 'info' // 'success', 'error', 'warning', 'info'
}
};
this.authService = new AuthService();
}
@@ -39,6 +57,15 @@ class App extends Component {
this.checkAuthStatus();
}
componentDidUpdate(prevProps, prevState) {
// Clear targetTab after navigation is complete
if (this.state.targetTab && prevState.currentView !== this.state.currentView) {
setTimeout(() => {
this.setState({ targetTab: null });
}, 100); // Small delay to ensure navigation completes
}
}
checkAuthStatus = async () => {
try {
const token = localStorage.getItem('token');
@@ -46,6 +73,7 @@ class App extends Component {
const user = await this.authService.verifyToken(token);
if (user) {
this.setState({ isAuthenticated: true, user, loading: false });
this.fetchDocumentStatus();
return;
}
}
@@ -62,6 +90,7 @@ class App extends Component {
if (result.success) {
localStorage.setItem('token', result.token);
this.setState({ isAuthenticated: true, user: result.user });
this.fetchDocumentStatus();
}
} catch (error) {
console.error('Login failed:', error);
@@ -83,8 +112,135 @@ class App extends Component {
this.setState({ currentView: newValue });
};
showSnackbar = (message, severity = 'info') => {
this.setState({
snackbar: {
open: true,
message,
severity
}
});
};
handleSnackbarClose = (event, reason) => {
if (reason === 'clickaway') {
return;
}
this.setState({
snackbar: {
...this.state.snackbar,
open: false
}
});
};
fetchDocumentStatus = async () => {
try {
const token = localStorage.getItem('token');
if (!token) {
console.log('No token found for document status');
return;
}
console.log('Fetching document status...');
const response = await fetch('/api/data/document-status', {
headers: {
'Authorization': `Bearer ${token}`,
'Content-Type': 'application/json',
},
});
if (response.ok) {
const status = await response.json();
console.log('Document status received:', status);
this.setState({ documentStatus: status });
} else {
console.error('Failed to fetch document status:', response.status, await response.text());
}
} catch (error) {
console.error('Error fetching document status:', error);
}
};
handleProcessing = async (processType) => {
if (this.state.processingStatus[processType]) {
return; // Already processing
}
// Handle datev upload navigation
if (processType === 'datev-upload') {
this.setState({
currentView: 'tables',
targetTab: {
level1: 3, // CSV Import tab
level2: 'DATEV_LINKS' // DATEV Beleglinks tab
}
});
return;
}
// Check if there are documents to process
const statusKey = processType === 'datev-sync' ? 'needDatevSync' :
processType === 'extraction' ? 'needExtraction' : 'needMarkdown';
if (!this.state.documentStatus || this.state.documentStatus[statusKey] === 0) {
this.showSnackbar(`No documents need ${processType} processing at this time.`, 'info');
return;
}
this.setState(prevState => ({
processingStatus: {
...prevState.processingStatus,
[processType]: true
}
}));
try {
const token = localStorage.getItem('token');
if (!token) return;
const response = await fetch(`/api/data/process-${processType}`, {
method: 'POST',
headers: {
'Authorization': `Bearer ${token}`,
'Content-Type': 'application/json',
},
});
if (response.ok) {
const result = await response.json();
console.log(`${processType} processing result:`, result);
this.showSnackbar(`${processType} processing completed successfully!`, 'success');
// Refresh document status after successful processing
await this.fetchDocumentStatus();
} else {
const error = await response.json();
console.error(`Failed to process ${processType}:`, error);
this.showSnackbar(`Failed to process ${processType}: ${error.error || response.status}`, 'error');
}
} catch (error) {
console.error(`Error processing ${processType}:`, error);
this.showSnackbar(`Error processing ${processType}: ${error.message}`, 'error');
} finally {
this.setState(prevState => ({
processingStatus: {
...prevState.processingStatus,
[processType]: false
}
}));
}
};
isOAuthCallback = () => {
return window.location.pathname === '/auth/callback';
};
render() {
const { isAuthenticated, user, loading, currentView } = this.state;
const { isAuthenticated, user, loading, currentView, documentStatus, processingStatus, snackbar } = this.state;
// Debug logging
console.log('App render - documentStatus:', documentStatus);
console.log('App render - isAuthenticated:', isAuthenticated);
if (loading) {
return (
@@ -149,6 +305,112 @@ class App extends Component {
sx={{ minHeight: 48 }}
/>
</Tabs>
<Divider orientation="vertical" flexItem sx={{ mx: 2, backgroundColor: 'rgba(255, 255, 255, 0.3)' }} />
<Box sx={{ display: 'flex', alignItems: 'center', gap: 1 }}>
<Tooltip title={processingStatus.markdown ? 'Running markdown conversion… this can take a while' : 'Process markdown conversion'} arrow>
<span>
<Button
color="inherit"
size="small"
onClick={() => this.handleProcessing('markdown')}
disabled={processingStatus.markdown || !documentStatus}
sx={{
minWidth: 'auto',
px: 1,
'&:hover': { backgroundColor: 'rgba(255, 255, 255, 0.1)' }
}}
>
<Badge
badgeContent={documentStatus?.needMarkdown || 0}
color={documentStatus?.needMarkdown > 0 ? "error" : "default"}
max={999999}
sx={{ mr: 0.5 }}
>
<DocumentScannerIcon fontSize="small" />
</Badge>
{processingStatus.markdown && (
<CircularProgress size={14} color="inherit" />
)}
</Button>
</span>
</Tooltip>
<Tooltip title={processingStatus.extraction ? 'Running data extraction… this can take a while' : 'Process data extraction'} arrow>
<span>
<Button
color="inherit"
size="small"
onClick={() => this.handleProcessing('extraction')}
disabled={processingStatus.extraction || !documentStatus}
sx={{
minWidth: 'auto',
px: 1,
'&:hover': { backgroundColor: 'rgba(255, 255, 255, 0.1)' }
}}
>
<Badge
badgeContent={documentStatus?.needExtraction || 0}
color={documentStatus?.needExtraction > 0 ? "warning" : "default"}
max={999999}
sx={{ mr: 0.5 }}
>
<ExtractIcon fontSize="small" />
</Badge>
{processingStatus.extraction && (
<CircularProgress size={14} color="inherit" />
)}
</Button>
</span>
</Tooltip>
<Tooltip title={processingStatus.datevSync ? 'Running DATEV sync… this can take a while' : 'Process Datev sync'} arrow>
<span>
<Button
color="inherit"
size="small"
onClick={() => this.handleProcessing('datev-sync')}
disabled={processingStatus.datevSync || !documentStatus}
sx={{
minWidth: 'auto',
px: 1,
'&:hover': { backgroundColor: 'rgba(255, 255, 255, 0.1)' }
}}
>
<Badge
badgeContent={documentStatus?.needDatevSync || 0}
color={documentStatus?.needDatevSync > 0 ? "info" : "default"}
max={999999}
sx={{ mr: 0.5 }}
>
<EmailIcon fontSize="small" />
</Badge>
{processingStatus.datevSync && (
<CircularProgress size={14} color="inherit" />
)}
</Button>
</span>
</Tooltip>
<Button
color="inherit"
size="small"
onClick={() => this.handleProcessing('datev-upload')}
disabled={processingStatus.datevUpload || !documentStatus}
sx={{
minWidth: 'auto',
px: 1,
'&:hover': { backgroundColor: 'rgba(255, 255, 255, 0.1)' }
}}
title="Process Datev CSV upload"
>
<Badge
badgeContent={documentStatus?.needDatevUpload || 0}
color={documentStatus?.needDatevUpload > 0 ? "secondary" : "default"}
max={999999}
sx={{ mr: 0.5 }}
>
<UploadIcon fontSize="small" />
</Badge>
{processingStatus.datevUpload && <PlayArrowIcon fontSize="small" />}
</Button>
</Box>
{this.state.exportData && (
<Button
color="inherit"
@@ -184,22 +446,44 @@ class App extends Component {
</>
)}
</Toolbar>
{(processingStatus.markdown || processingStatus.extraction || processingStatus.datevSync) && (
<LinearProgress color="secondary" />
)}
</AppBar>
<Box sx={{ height: 'calc(100vh - 64px)', display: 'flex', flexDirection: 'column' }}>
<Container maxWidth={false} sx={{ mt: 4, flex: 1, minHeight: 0, display: 'flex', flexDirection: 'column', width: '100%' }}>
{isAuthenticated ? (
{this.isOAuthCallback() ? (
<OAuthCallback />
) : isAuthenticated ? (
<DataViewer
user={user}
onUpdateExportData={this.updateExportData}
currentView={currentView}
onViewChange={this.handleViewChange}
targetTab={this.state.targetTab}
/>
) : (
<Login onLogin={this.handleLogin} />
)}
</Container>
</Box>
<Snackbar
open={snackbar.open}
autoHideDuration={6000}
onClose={this.handleSnackbarClose}
anchorOrigin={{ vertical: 'bottom', horizontal: 'right' }}
>
<Alert
onClose={this.handleSnackbarClose}
severity={snackbar.severity}
variant="filled"
sx={{ width: '100%' }}
>
{snackbar.message}
</Alert>
</Snackbar>
</ThemeProvider>
);
}

View File

@@ -0,0 +1,550 @@
import React, { Component } from 'react';
import {
Box,
Typography,
Button,
Table,
TableBody,
TableCell,
TableContainer,
TableHead,
TableRow,
Paper,
TextField,
Select,
MenuItem,
FormControl,
InputLabel,
IconButton,
Dialog,
DialogTitle,
DialogContent,
DialogActions,
Alert,
Chip
} from '@mui/material';
import {
Add as AddIcon,
Delete as DeleteIcon,
Edit as EditIcon,
Save as SaveIcon,
Cancel as CancelIcon
} from '@mui/icons-material';
import AuthService from '../services/AuthService';
class AccountingItemsManager extends Component {
constructor(props) {
super(props);
this.state = {
accountingItems: [],
kontos: [],
bus: [],
loading: true,
editingItem: null,
showCreateDialog: false,
showCreateKontoDialog: false,
jtlKontierung: null,
newItem: {
umsatz_brutto: '',
soll_haben_kz: 'S',
konto: '',
bu: '',
rechnungsnummer: '',
buchungstext: ''
},
newKonto: {
konto: '',
name: ''
},
error: null,
saving: false
};
this.authService = new AuthService();
}
componentDidMount() {
this.loadData();
this.loadJtlKontierung();
}
loadData = async () => {
try {
// Load accounting items for this transaction
await this.loadAccountingItems();
// Load Konto and BU options
await Promise.all([
this.loadKontos(),
this.loadBUs()
]);
this.setState({ loading: false });
} catch (error) {
console.error('Error loading data:', error);
this.setState({
error: 'Fehler beim Laden der Daten',
loading: false
});
}
};
loadJtlKontierung = async () => {
try {
const { transaction } = this.props;
if (!transaction || !transaction.jtlId) {
this.setState({ jtlKontierung: undefined });
return;
}
const response = await this.authService.apiCall(`/data/jtl-kontierung/${transaction.jtlId}`);
if (!response) return;
if (response.ok) {
const data = await response.json();
this.setState({ jtlKontierung: data });
} else {
const err = await response.json();
console.error('Failed to load JTL Kontierung:', err);
this.setState({ jtlKontierung: undefined });
}
} catch (e) {
console.error('Error loading JTL Kontierung:', e);
this.setState({ jtlKontierung: undefined });
}
}
loadAccountingItems = async () => {
const { transaction } = this.props;
if (!transaction?.id) return;
try {
const response = await this.authService.apiCall(`/data/accounting-items/${transaction.id}`);
if (response && response.ok) {
const items = await response.json();
this.setState({ accountingItems: items });
}
} catch (error) {
console.error('Error loading accounting items:', error);
}
};
loadKontos = async () => {
try {
const response = await this.authService.apiCall('/data/kontos');
if (response && response.ok) {
const kontos = await response.json();
this.setState({ kontos });
}
} catch (error) {
console.error('Error loading kontos:', error);
}
};
loadBUs = async () => {
try {
const response = await this.authService.apiCall('/data/bus');
if (response && response.ok) {
const bus = await response.json();
this.setState({ bus });
}
} catch (error) {
console.error('Error loading BUs:', error);
}
};
handleCreateItem = () => {
const { transaction } = this.props;
this.setState({
showCreateDialog: true,
newItem: {
umsatz_brutto: Math.abs(transaction.numericAmount || 0).toString(),
soll_haben_kz: (transaction.numericAmount || 0) >= 0 ? 'H' : 'S',
konto: '',
bu: '',
rechnungsnummer: '',
buchungstext: transaction.description || ''
}
});
};
handleSaveItem = async () => {
const { transaction } = this.props;
const { newItem } = this.state;
if (!newItem.umsatz_brutto || !newItem.konto) {
this.setState({ error: 'Betrag und Konto sind erforderlich' });
return;
}
this.setState({ saving: true, error: null });
try {
const itemData = {
...newItem,
transaction_id: transaction.isFromCSV ? null : transaction.id,
csv_transaction_id: transaction.isFromCSV ? transaction.id : null,
buchungsdatum: transaction.parsed_date || new Date().toISOString().split('T')[0]
};
const response = await this.authService.apiCall('/data/accounting-items', {
method: 'POST',
body: JSON.stringify(itemData)
});
if (response && response.ok) {
await this.loadAccountingItems();
this.setState({
showCreateDialog: false,
saving: false,
newItem: {
umsatz_brutto: '',
soll_haben_kz: 'S',
konto: '',
bu: '',
rechnungsnummer: '',
buchungstext: ''
}
});
} else {
const errorData = await response.json();
this.setState({
error: errorData.error || 'Fehler beim Speichern',
saving: false
});
}
} catch (error) {
console.error('Error saving accounting item:', error);
this.setState({
error: 'Fehler beim Speichern',
saving: false
});
}
};
handleCreateKonto = async () => {
const { newKonto } = this.state;
if (!newKonto.konto || !newKonto.name) {
this.setState({ error: 'Konto-Nummer und Name sind erforderlich' });
return;
}
this.setState({ saving: true, error: null });
try {
const response = await this.authService.apiCall('/data/kontos', {
method: 'POST',
body: JSON.stringify(newKonto)
});
if (response && response.ok) {
await this.loadKontos();
this.setState({
showCreateKontoDialog: false,
saving: false,
newKonto: { konto: '', name: '' }
});
} else {
const errorData = await response.json();
this.setState({
error: errorData.error || 'Fehler beim Erstellen des Kontos',
saving: false
});
}
} catch (error) {
console.error('Error creating konto:', error);
this.setState({
error: 'Fehler beim Erstellen des Kontos',
saving: false
});
}
};
handleDeleteItem = async (itemId) => {
if (!window.confirm('Buchungsposten wirklich löschen?')) return;
try {
const response = await this.authService.apiCall(`/data/accounting-items/${itemId}`, {
method: 'DELETE'
});
if (response && response.ok) {
await this.loadAccountingItems();
}
} catch (error) {
console.error('Error deleting accounting item:', error);
this.setState({ error: 'Fehler beim Löschen' });
}
};
calculateTotal = () => {
return this.state.accountingItems.reduce((sum, item) => {
const amount = parseFloat(item.umsatz_brutto) || 0;
return sum + (item.soll_haben_kz === 'S' ? amount : -amount);
}, 0);
};
render() {
const { transaction } = this.props;
const {
accountingItems,
kontos,
bus,
loading,
showCreateDialog,
showCreateKontoDialog,
newItem,
newKonto,
error,
saving
} = this.state;
if (loading) {
return <Typography>Lade Buchungsdaten...</Typography>;
}
const transactionAmount = transaction.numericAmount || 0;
const currentTotal = this.calculateTotal();
const isBalanced = Math.abs(currentTotal - Math.abs(transactionAmount)) < 0.01;
return (
<Box>
<Box sx={{ display: 'flex', justifyContent: 'space-between', alignItems: 'center', mb: 2 }}>
<Typography variant="h6">
Buchungsposten
</Typography>
<Button
variant="contained"
startIcon={<AddIcon />}
onClick={this.handleCreateItem}
size="small"
>
Hinzufügen
</Button>
</Box>
{error && (
<Alert severity="error" sx={{ mb: 2 }}>
{error}
</Alert>
)}
<Box sx={{ mb: 2, p: 2, bgcolor: isBalanced ? '#e8f5e8' : '#fff3e0', borderRadius: 1 }}>
<Typography variant="body2">
<strong>Transaktionsbetrag:</strong> {Math.abs(transactionAmount).toFixed(2)}
</Typography>
<Typography variant="body2">
<strong>Summe Buchungsposten:</strong> {Math.abs(currentTotal).toFixed(2)}
</Typography>
<Chip
label={isBalanced ? "✅ Ausgeglichen" : "⚠️ Nicht ausgeglichen"}
color={isBalanced ? "success" : "warning"}
size="small"
sx={{ mt: 1 }}
/>
</Box>
{transaction?.jtlId && (
<Box sx={{ mb: 2, p: 2, border: '1px dashed #999', borderRadius: 1 }}>
<Typography variant="subtitle2">Debug: tUmsatzKontierung.data</Typography>
<Typography variant="caption" component="div" sx={{ whiteSpace: 'pre-wrap', wordBreak: 'break-word' }}>
{this.state.jtlKontierung === undefined
? 'undefined'
: this.state.jtlKontierung === null
? 'null'
: typeof this.state.jtlKontierung === 'object'
? JSON.stringify(this.state.jtlKontierung, null, 2)
: String(this.state.jtlKontierung)}
</Typography>
</Box>
)}
<TableContainer component={Paper}>
<Table size="small">
<TableHead>
<TableRow>
<TableCell>Betrag</TableCell>
<TableCell>S/H</TableCell>
<TableCell>Konto</TableCell>
<TableCell>BU</TableCell>
<TableCell>Buchungstext</TableCell>
<TableCell>Aktionen</TableCell>
</TableRow>
</TableHead>
<TableBody>
{accountingItems.map((item) => (
<TableRow key={item.id}>
<TableCell>{parseFloat(item.umsatz_brutto).toFixed(2)} </TableCell>
<TableCell>
<Chip
label={item.soll_haben_kz}
color={item.soll_haben_kz === 'S' ? 'primary' : 'secondary'}
size="small"
/>
</TableCell>
<TableCell>
{item.konto} - {item.konto_name}
</TableCell>
<TableCell>
{item.bu ? `${item.bu} - ${item.bu_name}` : '-'}
</TableCell>
<TableCell>{item.buchungstext || '-'}</TableCell>
<TableCell>
<IconButton
size="small"
onClick={() => this.handleDeleteItem(item.id)}
color="error"
>
<DeleteIcon />
</IconButton>
</TableCell>
</TableRow>
))}
{accountingItems.length === 0 && (
<TableRow>
<TableCell colSpan={6} align="center">
<Typography color="textSecondary">
Keine Buchungsposten vorhanden
</Typography>
</TableCell>
</TableRow>
)}
</TableBody>
</Table>
</TableContainer>
{/* Create Item Dialog */}
<Dialog open={showCreateDialog} onClose={() => this.setState({ showCreateDialog: false })} maxWidth="sm" fullWidth>
<DialogTitle>Neuen Buchungsposten erstellen</DialogTitle>
<DialogContent>
<Box sx={{ display: 'flex', flexDirection: 'column', gap: 2, mt: 1 }}>
<TextField
label="Betrag"
type="number"
value={newItem.umsatz_brutto}
onChange={(e) => this.setState({
newItem: { ...newItem, umsatz_brutto: e.target.value }
})}
required
fullWidth
/>
<FormControl fullWidth>
<InputLabel>Soll/Haben</InputLabel>
<Select
value={newItem.soll_haben_kz}
onChange={(e) => this.setState({
newItem: { ...newItem, soll_haben_kz: e.target.value }
})}
>
<MenuItem value="S">Soll (S)</MenuItem>
<MenuItem value="H">Haben (H)</MenuItem>
</Select>
</FormControl>
<Box sx={{ display: 'flex', gap: 1, alignItems: 'flex-end' }}>
<FormControl fullWidth>
<InputLabel>Konto</InputLabel>
<Select
value={newItem.konto}
onChange={(e) => this.setState({
newItem: { ...newItem, konto: e.target.value }
})}
>
{kontos.map((konto) => (
<MenuItem key={konto.id} value={konto.konto}>
{konto.konto} - {konto.name}
</MenuItem>
))}
</Select>
</FormControl>
<Button
variant="outlined"
onClick={() => this.setState({ showCreateKontoDialog: true })}
sx={{ minWidth: 'auto', px: 1 }}
>
<AddIcon />
</Button>
</Box>
<FormControl fullWidth>
<InputLabel>BU (Steuercode)</InputLabel>
<Select
value={newItem.bu}
onChange={(e) => this.setState({
newItem: { ...newItem, bu: e.target.value }
})}
>
<MenuItem value="">Kein BU</MenuItem>
{bus.map((bu) => (
<MenuItem key={bu.id} value={bu.bu}>
{bu.bu} - {bu.name} ({bu.vst}%)
</MenuItem>
))}
</Select>
</FormControl>
<TextField
label="Buchungstext"
value={newItem.buchungstext}
onChange={(e) => this.setState({
newItem: { ...newItem, buchungstext: e.target.value }
})}
fullWidth
multiline
rows={2}
/>
</Box>
</DialogContent>
<DialogActions>
<Button onClick={() => this.setState({ showCreateDialog: false })}>
Abbrechen
</Button>
<Button onClick={this.handleSaveItem} variant="contained" disabled={saving}>
{saving ? 'Speichern...' : 'Speichern'}
</Button>
</DialogActions>
</Dialog>
{/* Create Konto Dialog */}
<Dialog open={showCreateKontoDialog} onClose={() => this.setState({ showCreateKontoDialog: false })} maxWidth="xs" fullWidth>
<DialogTitle>Neues Konto erstellen</DialogTitle>
<DialogContent>
<Box sx={{ display: 'flex', flexDirection: 'column', gap: 2, mt: 1 }}>
<TextField
label="Konto-Nummer"
value={newKonto.konto}
onChange={(e) => this.setState({
newKonto: { ...newKonto, konto: e.target.value }
})}
required
fullWidth
/>
<TextField
label="Konto-Name"
value={newKonto.name}
onChange={(e) => this.setState({
newKonto: { ...newKonto, name: e.target.value }
})}
required
fullWidth
/>
</Box>
</DialogContent>
<DialogActions>
<Button onClick={() => this.setState({ showCreateKontoDialog: false })}>
Abbrechen
</Button>
<Button onClick={this.handleCreateKonto} variant="contained" disabled={saving}>
{saving ? 'Erstellen...' : 'Erstellen'}
</Button>
</DialogActions>
</Dialog>
</Box>
);
}
}
export default AccountingItemsManager;

View File

@@ -1,9 +1,5 @@
import React, { Component } from 'react';
import {
Dialog,
DialogTitle,
DialogContent,
DialogActions,
Button,
Typography,
Box,
@@ -11,45 +7,149 @@ import {
CircularProgress,
LinearProgress,
Chip,
Tabs,
Tab,
Divider,
Paper,
} from '@mui/material';
import {
CloudUpload as UploadIcon,
CheckCircle as SuccessIcon,
Error as ErrorIcon,
Link as LinkIcon,
AccountBalance as AccountIcon,
InfoOutlined as InfoIcon,
} from '@mui/icons-material';
import AuthService from '../services/AuthService';
class CSVImportDialog extends Component {
const IMPORT_TYPES = {
BANKING: 'BANKING',
DATEV_LINKS: 'DATEV_LINKS',
};
class CSVImportPanel extends Component {
constructor(props) {
super(props);
this.state = {
file: null,
csvData: null,
headers: null,
// common
activeTab: IMPORT_TYPES.BANKING,
importing: false,
imported: false,
importResult: null,
error: null,
// drag/drop visual
dragOver: false,
// banking state
file: null,
csvData: null,
headers: null,
// datev links state
datevFile: null,
datevCsvData: null,
datevHeaders: null,
};
this.authService = new AuthService();
this.fileInputRef = React.createRef();
this.datevFileInputRef = React.createRef();
}
componentDidMount() {
// Check if we should navigate to a specific tab
if (this.props.targetTab) {
this.setState({ activeTab: this.props.targetTab });
}
}
componentDidUpdate(prevProps) {
// Handle targetTab changes
if (this.props.targetTab !== prevProps.targetTab && this.props.targetTab) {
this.setState({ activeTab: this.props.targetTab });
}
}
// Tab switch resets type-specific state but keeps success state as-is
handleTabChange = (_e, value) => {
this.setState({
activeTab: value,
// clear type-specific selections and errors
file: null,
csvData: null,
headers: null,
datevFile: null,
datevCsvData: null,
datevHeaders: null,
error: null,
dragOver: false,
// keep importing false when switching
importing: false,
// keep imported/result to show success for last action regardless of tab
// Alternatively, uncomment next two lines to reset success on tab change:
// imported: false,
// importResult: null,
});
};
// Generic CSV parser (semicolon with quotes)
parseCSV = (text) => {
const lines = text.split('\n').filter(line => line.trim());
if (lines.length < 2) {
throw new Error('CSV-Datei muss mindestens eine Kopfzeile und eine Datenzeile enthalten');
}
const parseCSVLine = (line) => {
const result = [];
let current = '';
let inQuotes = false;
for (let i = 0; i < line.length; i++) {
const char = line[i];
if (char === '"') {
inQuotes = !inQuotes;
} else if (char === ';' && !inQuotes) {
result.push(current.trim());
current = '';
} else {
current += char;
}
}
result.push(current.trim());
return result;
};
const headers = parseCSVLine(lines[0]);
const dataRows = lines.slice(1).map(line => {
const values = parseCSVLine(line);
const row = {};
headers.forEach((header, index) => {
row[header] = values[index] || '';
});
return row;
});
return { headers, dataRows };
};
// Banking file handlers
handleFileSelect = (event) => {
const file = event.target.files[0];
if (file) {
this.processFile(file);
this.processFile(file, IMPORT_TYPES.BANKING);
}
};
// DATEV file handlers
handleDatevFileSelect = (event) => {
const file = event.target.files[0];
if (file) {
this.processFile(file, IMPORT_TYPES.DATEV_LINKS);
}
};
handleDrop = (event) => {
event.preventDefault();
this.setState({ dragOver: false });
const file = event.dataTransfer.files[0];
if (file) {
this.processFile(file);
// route to active tab
this.processFile(file, this.state.activeTab);
}
};
@@ -62,75 +162,49 @@ class CSVImportDialog extends Component {
this.setState({ dragOver: false });
};
processFile = (file) => {
processFile = (file, type) => {
if (!file.name.toLowerCase().endsWith('.csv')) {
this.setState({ error: 'Bitte wählen Sie eine CSV-Datei aus' });
return;
}
this.setState({ file, error: null, csvData: null, headers: null });
const reader = new FileReader();
reader.onload = (e) => {
try {
const text = e.target.result;
const lines = text.split('\n').filter(line => line.trim());
if (lines.length < 2) {
this.setState({ error: 'CSV-Datei muss mindestens eine Kopfzeile und eine Datenzeile enthalten' });
return;
}
// Parse CSV (simple parsing - assumes semicolon separator and quoted fields)
const parseCSVLine = (line) => {
const result = [];
let current = '';
let inQuotes = false;
for (let i = 0; i < line.length; i++) {
const char = line[i];
if (char === '"') {
inQuotes = !inQuotes;
} else if (char === ';' && !inQuotes) {
result.push(current.trim());
current = '';
} else {
current += char;
}
}
result.push(current.trim());
return result;
};
const headers = parseCSVLine(lines[0]);
const dataRows = lines.slice(1).map(line => {
const values = parseCSVLine(line);
const row = {};
headers.forEach((header, index) => {
row[header] = values[index] || '';
const { headers, dataRows } = this.parseCSV(text);
if (type === IMPORT_TYPES.BANKING) {
this.setState({
file,
csvData: dataRows,
headers,
error: null,
});
return row;
});
this.setState({
csvData: dataRows,
headers,
error: null
});
} catch (error) {
console.error('Error parsing CSV:', error);
this.setState({ error: 'Fehler beim Lesen der CSV-Datei' });
} else {
this.setState({
datevFile: file,
datevCsvData: dataRows,
datevHeaders: headers,
error: null,
});
}
} catch (err) {
console.error('Error parsing CSV:', err);
this.setState({ error: err.message || 'Fehler beim Lesen der CSV-Datei' });
}
};
reader.readAsText(file, 'UTF-8');
};
handleImport = async () => {
const { csvData, headers, file } = this.state;
if (!csvData || csvData.length === 0) {
const {
activeTab,
file, csvData, headers,
datevFile, datevCsvData, datevHeaders,
} = this.state;
const isBanking = activeTab === IMPORT_TYPES.BANKING;
const hasData = isBanking ? (csvData && csvData.length > 0) : (datevCsvData && datevCsvData.length > 0);
if (!hasData) {
this.setState({ error: 'Keine Daten zum Importieren gefunden' });
return;
}
@@ -138,155 +212,216 @@ class CSVImportDialog extends Component {
this.setState({ importing: true, error: null });
try {
const response = await this.authService.apiCall('/data/import-csv-transactions', {
method: 'POST',
body: JSON.stringify({
let endpoint = '';
let payload = {};
if (isBanking) {
endpoint = '/data/import-csv-transactions';
payload = {
transactions: csvData,
headers: headers,
filename: file.name,
batchId: `import_${Date.now()}_${file.name}`
})
batchId: `import_${Date.now()}_${file.name}`,
};
} else {
// Placeholder endpoint for DATEV Beleglinks (adjust when backend is available)
endpoint = '/data/import-datev-beleglinks';
payload = {
beleglinks: datevCsvData,
headers: datevHeaders,
filename: datevFile.name,
batchId: `datev_${Date.now()}_${datevFile.name}`,
};
}
const response = await this.authService.apiCall(endpoint, {
method: 'POST',
body: JSON.stringify(payload),
});
if (response && response.ok) {
const result = await response.json();
this.setState({
importing: false,
imported: true,
importResult: result
this.setState({
importing: false,
imported: true,
importResult: result,
});
if (this.props.onImportSuccess) {
this.props.onImportSuccess(result);
}
} else {
const errorData = await response.json();
this.setState({
importing: false,
error: errorData.error || 'Import fehlgeschlagen'
let errorText = 'Import fehlgeschlagen';
try {
const errorData = await response.json();
errorText = errorData.error || errorText;
} catch (_) {}
this.setState({
importing: false,
error: errorText,
});
}
} catch (error) {
console.error('Import error:', error);
this.setState({
importing: false,
error: 'Netzwerkfehler beim Import'
this.setState({
importing: false,
error: 'Netzwerkfehler beim Import',
});
}
};
handleClose = () => {
this.setState({
file: null,
csvData: null,
headers: null,
// common
importing: false,
imported: false,
importResult: null,
error: null,
dragOver: false,
// banking
file: null,
csvData: null,
headers: null,
// datev
datevFile: null,
datevCsvData: null,
datevHeaders: null,
});
if (this.props.onClose) {
this.props.onClose();
}
};
renderUploadPanel = ({ isBanking }) => {
const {
dragOver,
file, csvData, headers,
datevFile, datevCsvData, datevHeaders,
} = this.state;
const currentFile = isBanking ? file : datevFile;
const currentHeaders = isBanking ? headers : datevHeaders;
const currentData = isBanking ? csvData : datevCsvData;
const onClickPick = () => {
if (isBanking) {
this.fileInputRef.current?.click();
} else {
this.datevFileInputRef.current?.click();
}
};
return (
<>
<Box
sx={{
border: '2px dashed',
borderColor: dragOver ? 'primary.main' : 'grey.300',
borderRadius: 2,
p: 4,
textAlign: 'center',
bgcolor: dragOver ? 'action.hover' : 'background.paper',
cursor: 'pointer',
mb: 2,
}}
onDrop={this.handleDrop}
onDragOver={this.handleDragOver}
onDragLeave={this.handleDragLeave}
onClick={onClickPick}
>
<input
type="file"
accept=".csv"
onChange={isBanking ? this.handleFileSelect : this.handleDatevFileSelect}
ref={isBanking ? this.fileInputRef : this.datevFileInputRef}
style={{ display: 'none' }}
/>
{isBanking ? (
<AccountIcon sx={{ fontSize: 48, color: 'grey.400', mb: 2 }} />
) : (
<LinkIcon sx={{ fontSize: 48, color: 'grey.400', mb: 2 }} />
)}
<Typography variant="h6" gutterBottom>
{isBanking ? 'Bankkontoumsätze CSV hier ablegen oder klicken zum Auswählen' : 'DATEV Beleglinks CSV hier ablegen oder klicken zum Auswählen'}
</Typography>
<Typography variant="body2" color="textSecondary">
Unterstützte Formate: .csv (Semikolon-getrennt)
</Typography>
</Box>
{currentFile && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Ausgewählte Datei:
</Typography>
<Chip label={currentFile.name} color="primary" />
</Box>
)}
{currentHeaders && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Erkannte Spalten ({currentHeaders.length}):
</Typography>
<Box sx={{ display: 'flex', flexWrap: 'wrap', gap: 0.5 }}>
{currentHeaders.slice(0, 10).map((header, index) => (
<Chip key={index} label={header} size="small" variant="outlined" />
))}
{currentHeaders.length > 10 && (
<Chip label={`+${currentHeaders.length - 10} weitere`} size="small" />
)}
</Box>
</Box>
)}
{currentData && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
{isBanking ? 'Gefundene Transaktionen' : 'Gefundene Beleglinks'}: {currentData.length}
</Typography>
<Typography variant="body2" color="textSecondary">
Die Daten werden validiert und in die Datenbank importiert.
</Typography>
</Box>
)}
</>
);
};
render() {
const { open } = this.props;
const {
file,
csvData,
headers,
importing,
imported,
importResult,
error,
dragOver
const {
activeTab,
importing,
imported,
importResult,
error,
csvData,
datevCsvData,
} = this.state;
const isBanking = activeTab === IMPORT_TYPES.BANKING;
const hasData = isBanking ? csvData : datevCsvData;
return (
<Dialog
open={open}
onClose={!importing ? this.handleClose : undefined}
maxWidth="md"
fullWidth
>
<DialogTitle>
CSV Transaktionen Importieren
</DialogTitle>
<Paper sx={{ p: 3 }}>
<Typography variant="h5" gutterBottom>
CSV Import
</Typography>
<Tabs
value={activeTab}
onChange={this.handleTabChange}
variant="fullWidth"
sx={{ borderBottom: 1, borderColor: 'divider', mb: 3 }}
>
<Tab value={IMPORT_TYPES.BANKING} iconPosition="start" icon={<AccountIcon />} label="Banking Umsätze" />
<Tab value={IMPORT_TYPES.DATEV_LINKS} iconPosition="start" icon={<LinkIcon />} label="DATEV Beleglinks" />
</Tabs>
<DialogContent>
<Box>
{!imported ? (
<>
{/* File Upload Area */}
<Box
sx={{
border: '2px dashed',
borderColor: dragOver ? 'primary.main' : 'grey.300',
borderRadius: 2,
p: 4,
textAlign: 'center',
bgcolor: dragOver ? 'action.hover' : 'background.paper',
cursor: 'pointer',
mb: 2,
}}
onDrop={this.handleDrop}
onDragOver={this.handleDragOver}
onDragLeave={this.handleDragLeave}
onClick={() => this.fileInputRef.current?.click()}
>
<input
type="file"
accept=".csv"
onChange={this.handleFileSelect}
ref={this.fileInputRef}
style={{ display: 'none' }}
/>
<UploadIcon sx={{ fontSize: 48, color: 'grey.400', mb: 2 }} />
<Typography variant="h6" gutterBottom>
CSV-Datei hier ablegen oder klicken zum Auswählen
</Typography>
<Typography variant="body2" color="textSecondary">
Unterstützte Formate: .csv (Semikolon-getrennt)
</Typography>
</Box>
{file && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Ausgewählte Datei:
</Typography>
<Chip label={file.name} color="primary" />
</Box>
)}
{headers && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Erkannte Spalten ({headers.length}):
</Typography>
<Box sx={{ display: 'flex', flexWrap: 'wrap', gap: 0.5 }}>
{headers.slice(0, 10).map((header, index) => (
<Chip key={index} label={header} size="small" variant="outlined" />
))}
{headers.length > 10 && (
<Chip label={`+${headers.length - 10} weitere`} size="small" />
)}
</Box>
</Box>
)}
{csvData && (
<Box sx={{ mb: 2 }}>
<Typography variant="subtitle2" gutterBottom>
Gefundene Transaktionen: {csvData.length}
</Typography>
<Typography variant="body2" color="textSecondary">
Die Daten werden validiert und in die Datenbank importiert.
</Typography>
</Box>
)}
{this.renderUploadPanel({ isBanking })}
{error && (
<Alert severity="error" sx={{ mb: 2 }}>
@@ -298,13 +433,12 @@ class CSVImportDialog extends Component {
<Box sx={{ mb: 2 }}>
<LinearProgress />
<Typography variant="body2" sx={{ mt: 1, textAlign: 'center' }}>
Importiere Transaktionen...
{isBanking ? 'Importiere Transaktionen...' : 'Importiere DATEV Beleglinks...'}
</Typography>
</Box>
)}
</>
) : (
/* Import Success */
<Box sx={{ textAlign: 'center', py: 2 }}>
<SuccessIcon sx={{ fontSize: 64, color: 'success.main', mb: 2 }} />
<Typography variant="h6" gutterBottom>
@@ -314,11 +448,21 @@ class CSVImportDialog extends Component {
{importResult && (
<Box sx={{ mt: 2 }}>
<Typography variant="body1" gutterBottom>
<strong>Importiert:</strong> {importResult.imported} Transaktionen
<strong>Hinzugefügt:</strong> {importResult.imported} {isBanking ? 'Transaktionen' : 'Datevlinks'}
</Typography>
{importResult.skipped > 0 && (
<Typography variant="body1" color="info.main">
<strong>Übersprungen:</strong> {importResult.skipped} Zeilen (bereits vorhanden, unbekanntes Format, etc.)
</Typography>
)}
{importResult.errors > 0 && (
<Typography variant="body1" color="warning.main">
<strong>Fehler:</strong> {importResult.errors} Zeilen übersprungen
<strong>Fehler:</strong> {importResult.errors} Zeilen konnten nicht verarbeitet werden.
</Typography>
)}
{importResult.message && (
<Typography variant="body2" color="textSecondary" sx={{ mt: 1 }}>
{importResult.message}
</Typography>
)}
<Typography variant="body2" color="textSecondary" sx={{ mt: 1 }}>
@@ -328,26 +472,31 @@ class CSVImportDialog extends Component {
)}
</Box>
)}
</DialogContent>
<DialogActions>
<Button onClick={this.handleClose} disabled={importing}>
{imported ? 'Schließen' : 'Abbrechen'}
</Button>
{!imported && csvData && (
<Button
onClick={this.handleImport}
variant="contained"
disabled={importing || !csvData}
startIcon={importing ? <CircularProgress size={16} /> : <UploadIcon />}
>
{importing ? 'Importiere...' : 'Importieren'}
</Button>
{!imported && hasData && (
<Box sx={{ mt: 3, textAlign: 'center' }}>
<Button
onClick={this.handleImport}
variant="contained"
size="large"
disabled={importing || !hasData}
startIcon={importing ? <CircularProgress size={16} /> : <UploadIcon />}
>
{importing ? 'Importiere...' : 'Importieren'}
</Button>
</Box>
)}
</DialogActions>
</Dialog>
{imported && (
<Box sx={{ mt: 3, textAlign: 'center' }}>
<Button onClick={this.handleClose} variant="outlined" size="large">
Neuer Import
</Button>
</Box>
)}
</Box>
</Paper>
);
}
}
export default CSVImportDialog;
export default CSVImportPanel;

View File

@@ -165,7 +165,7 @@ class DataViewer extends Component {
</>
) : (
<Box sx={{ flex: 1, minHeight: 0, overflow: 'auto', p: 2 }}>
<TableManagement user={user} />
<TableManagement user={user} targetTab={this.props.targetTab} />
</Box>
)}
</Box>

View File

@@ -9,6 +9,10 @@ class Login extends Component {
error: null,
loading: false,
};
// Flags to track FedCM attempts and success
this.fedcmAttempted = false;
this.fedcmSucceeded = false;
}
componentDidMount() {
@@ -34,11 +38,17 @@ class Login extends Component {
initializeGoogleSignIn = () => {
if (window.google && window.google.accounts) {
try {
// Note: Removed debug logging to avoid deprecated method warnings
console.log('REACT_APP_GOOGLE_CLIENT_ID', process.env.REACT_APP_GOOGLE_CLIENT_ID);
console.log('Current origin for Google auth:', window.location.origin);
console.log('User agent:', navigator.userAgent);
window.google.accounts.id.initialize({
client_id: process.env.REACT_APP_GOOGLE_CLIENT_ID || 'your_google_client_id_here',
client_id: process.env.REACT_APP_GOOGLE_CLIENT_ID,
callback: this.handleGoogleResponse,
auto_select: false,
cancel_on_tap_outside: true,
cancel_on_tap_outside: false,
});
console.log('✅ Google Sign-In initialized');
} catch (error) {
@@ -48,6 +58,9 @@ class Login extends Component {
};
handleGoogleResponse = (response) => {
// Mark FedCM as successful if we get here
this.fedcmSucceeded = true;
this.setState({ loading: true, error: null });
this.props.onLogin(response)
.catch((error) => {
@@ -70,6 +83,11 @@ class Login extends Component {
errorMessage = '🚫 Zugriff verweigert: Ihre E-Mail-Adresse ist nicht autorisiert. Versuchen Sie, sich mit einem anderen Google-Konto anzumelden.';
} else if (error.message.includes('No authorized users configured')) {
errorMessage = '🔒 Kein Zugriff: Derzeit sind keine Benutzer autorisiert. Wenden Sie sich an den Administrator.';
} else if (error.message.includes('Not signed in with the identity provider') ||
error.message.includes('NetworkError') ||
error.message.includes('FedCM')) {
// FedCM failed, offer redirect option
errorMessage = '🔄 Schnelle Anmeldung nicht verfügbar. Versuchen Sie die Standard-Anmeldung.';
} else {
// Show the actual error message from the server
errorMessage = `❌ Anmeldefehler: ${error.message}`;
@@ -92,29 +110,105 @@ class Login extends Component {
return;
}
// Clear any previous error
this.setState({ error: null, loading: false });
// Clear any previous error and start loading
this.setState({ error: null, loading: true });
// Try FedCM first (seamless for users already signed in to Google)
console.log('🎯 Trying FedCM first for optimal UX...');
this.tryFedCMFirst();
};
tryFedCMFirst = () => {
if (window.google && window.google.accounts && window.google.accounts.id) {
try {
window.google.accounts.id.prompt();
} catch (error) {
console.error('Google prompt error:', error);
this.setState({
error: 'Google-Anmeldung konnte nicht geladen werden. Die Seite wird aktualisiert, um es erneut zu versuchen.',
loading: true
console.log('✅ Trying FedCM for seamless sign-in...');
// Listen for the specific FedCM errors that indicate no Google session
const originalConsoleError = console.error;
let errorIntercepted = false;
console.error = (...args) => {
const errorMessage = args.join(' ');
if (!errorIntercepted && (
errorMessage.includes('Not signed in with the identity provider') ||
errorMessage.includes('FedCM get() rejects with NetworkError') ||
errorMessage.includes('Error retrieving a token')
)) {
errorIntercepted = true;
console.error = originalConsoleError; // Restore immediately
console.log('🔄 FedCM failed (user not signed in to Google), using redirect...');
this.redirectToGoogleOAuth();
return;
}
originalConsoleError.apply(console, args);
};
// Try FedCM
window.google.accounts.id.prompt((notification) => {
console.log('🔍 FedCM notification:', notification);
console.error = originalConsoleError; // Restore console.error
// If we get here without error, FedCM is working
});
setTimeout(() => window.location.reload(), 2000);
} catch (error) {
console.error('FedCM initialization error, falling back to redirect:', error);
this.redirectToGoogleOAuth();
}
} else {
this.setState({
error: 'Google-Anmeldung nicht geladen. Die Seite wird aktualisiert, um es erneut zu versuchen.',
loading: true
});
setTimeout(() => window.location.reload(), 2000);
// Google Identity Services not loaded, go straight to redirect
console.log('📋 GSI not loaded, using redirect flow...');
this.redirectToGoogleOAuth();
}
};
redirectToGoogleOAuth = () => {
try {
// Generate a random state parameter for security
const state = this.generateRandomString(32);
sessionStorage.setItem('oauth_state', state);
// Build the Google OAuth2 authorization URL
const params = new URLSearchParams({
client_id: process.env.REACT_APP_GOOGLE_CLIENT_ID,
redirect_uri: window.location.origin + '/auth/callback',
response_type: 'code',
scope: 'openid email profile',
state: state,
access_type: 'online',
prompt: 'select_account'
});
const authUrl = `https://accounts.google.com/o/oauth2/v2/auth?${params.toString()}`;
console.log('🔗 Redirecting to Google OAuth:', authUrl);
// Redirect to Google OAuth
window.location.href = authUrl;
} catch (error) {
console.error('Redirect OAuth error:', error);
this.setState({
error: 'Google-Anmeldung konnte nicht gestartet werden.',
loading: false
});
}
};
generateRandomString = (length) => {
const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
let result = '';
for (let i = 0; i < length; i++) {
result += chars.charAt(Math.floor(Math.random() * chars.length));
}
return result;
};
handleUseRedirect = () => {
console.log('🔄 User chose redirect flow');
this.setState({ error: null, loading: true });
this.redirectToGoogleOAuth();
};
@@ -157,6 +251,20 @@ class Login extends Component {
{loading ? 'Anmeldung läuft...' : 'Mit Google anmelden'}
</Button>
{error && error.includes('Standard-Anmeldung') && (
<Button
fullWidth
variant="outlined"
size="large"
startIcon={<GoogleIcon />}
onClick={this.handleUseRedirect}
disabled={loading}
sx={{ py: 1.5, mt: 2 }}
>
Standard Google-Anmeldung verwenden
</Button>
)}
<Typography variant="caption" display="block" textAlign="center" sx={{ mt: 2 }}>

View File

@@ -0,0 +1,133 @@
import React, { Component } from 'react';
import { Box, CircularProgress, Typography, Alert } from '@mui/material';
class OAuthCallback extends Component {
constructor(props) {
super(props);
this.state = {
loading: true,
error: null,
};
}
componentDidMount() {
this.handleOAuthCallback();
}
handleOAuthCallback = async () => {
try {
const urlParams = new URLSearchParams(window.location.search);
const code = urlParams.get('code');
const state = urlParams.get('state');
const error = urlParams.get('error');
// Check for OAuth errors
if (error) {
throw new Error(`OAuth error: ${error}`);
}
// Verify state parameter for security
const storedState = sessionStorage.getItem('oauth_state');
if (!state || state !== storedState) {
throw new Error('Invalid state parameter - possible CSRF attack');
}
// Clear stored state
sessionStorage.removeItem('oauth_state');
if (!code) {
throw new Error('No authorization code received');
}
console.log('🔑 Authorization code received, exchanging for tokens...');
// Exchange authorization code for tokens via our backend
const response = await fetch('/api/auth/google/callback', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
code: code,
redirect_uri: window.location.origin + '/auth/callback'
}),
});
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
throw new Error(errorData.message || `HTTP ${response.status}: ${response.statusText}`);
}
const data = await response.json();
if (data.success && data.token) {
console.log('✅ OAuth callback successful');
// Store the JWT token
localStorage.setItem('token', data.token);
// Redirect to main app
window.location.href = '/';
} else {
throw new Error(data.message || 'Authentication failed');
}
} catch (error) {
console.error('OAuth callback error:', error);
this.setState({
loading: false,
error: error.message || 'Authentication failed'
});
}
};
render() {
const { loading, error } = this.state;
if (error) {
return (
<Box
display="flex"
justifyContent="center"
alignItems="center"
minHeight="60vh"
flexDirection="column"
>
<Alert severity="error" sx={{ mb: 2, maxWidth: 400 }}>
<Typography variant="h6" gutterBottom>
Anmeldung fehlgeschlagen
</Typography>
<Typography variant="body2">
{error}
</Typography>
</Alert>
<Typography variant="body2" color="textSecondary">
<a href="/" style={{ color: 'inherit' }}>
Zurück zur Anmeldung
</a>
</Typography>
</Box>
);
}
return (
<Box
display="flex"
justifyContent="center"
alignItems="center"
minHeight="60vh"
flexDirection="column"
>
<CircularProgress size={60} sx={{ mb: 2 }} />
<Typography variant="h6" gutterBottom>
Anmeldung wird verarbeitet...
</Typography>
<Typography variant="body2" color="textSecondary">
Sie werden automatisch weitergeleitet.
</Typography>
</Box>
);
}
}
export default OAuthCallback;

View File

@@ -15,7 +15,7 @@ import {
import KreditorTable from './admin/KreditorTable';
import KontoTable from './admin/KontoTable';
import BUTable from './admin/BUTable';
import CSVImportDialog from './CSVImportDialog';
import CSVImportPanel from './CSVImportDialog';
class TableManagement extends Component {
constructor(props) {
@@ -26,6 +26,21 @@ class TableManagement extends Component {
};
}
componentDidMount() {
// Check if we should navigate to a specific tab
if (this.props.targetTab?.level1 !== undefined) {
this.setState({ activeTab: this.props.targetTab.level1 });
}
}
componentDidUpdate(prevProps) {
// Handle targetTab changes
if (this.props.targetTab?.level1 !== prevProps.targetTab?.level1 &&
this.props.targetTab?.level1 !== undefined) {
this.setState({ activeTab: this.props.targetTab.level1 });
}
}
handleTabChange = (event, newValue) => {
this.setState({ activeTab: newValue });
};
@@ -90,10 +105,9 @@ class TableManagement extends Component {
<Typography variant="body2" color="text.secondary" paragraph>
Hier können Sie CSV-Dateien von Ihrer Bank importieren. Die Daten werden in die Datenbank gespeichert und können dann Banking-Konten zugeordnet werden.
</Typography>
<CSVImportDialog
open={true}
onClose={() => {}} // Always open in this tab
<CSVImportPanel
user={user}
targetTab={this.props.targetTab?.level2}
/>
</Box>
)}

View File

@@ -29,6 +29,7 @@ import {
import { AgGridReact } from 'ag-grid-react';
import KreditorSelector from '../KreditorSelector';
import BankingKreditorSelector from '../BankingKreditorSelector';
import AccountingItemsManager from '../AccountingItemsManager';
const DocumentRenderer = (params) => {
// Check for pdfs and links regardless of transaction source
@@ -466,30 +467,27 @@ const DocumentRenderer = (params) => {
)}
{tabValue === 1 && (
<Box sx={{ p: 2, height: 500 }}>
{lineItems.length > 0 ? (
<div style={{ height: '100%', width: '100%' }}>
<AgGridReact
columnDefs={columnDefs}
rowData={lineItems}
defaultColDef={defaultColDef}
suppressRowTransform={true}
rowHeight={50}
headerHeight={35}
domLayout="normal"
/>
</div>
) : (
<Box sx={{ textAlign: 'center', py: 4 }}>
<Typography variant="h6" color="textSecondary" gutterBottom>
Keine Buchungsdaten verfügbar
</Typography>
<Typography variant="body2" color="textSecondary">
{hasDocuments
? 'In den vorhandenen Dokumenten wurden keine Buchungsdaten gefunden.'
: 'Keine Dokumente vorhanden, daher keine Buchungsdaten verfügbar.'
}
<Box sx={{ p: 2 }}>
{/* Accounting Items Manager */}
<AccountingItemsManager transaction={params.data} />
{/* Document Line Items (if any) */}
{lineItems.length > 0 && (
<Box sx={{ mt: 3 }}>
<Typography variant="h6" gutterBottom>
Erkannte Positionen aus Dokumenten
</Typography>
<div style={{ height: '300px', width: '100%' }}>
<AgGridReact
columnDefs={columnDefs}
rowData={lineItems}
defaultColDef={defaultColDef}
suppressRowTransform={true}
rowHeight={50}
headerHeight={35}
domLayout="normal"
/>
</div>
</Box>
)}
</Box>

View File

@@ -1,56 +0,0 @@
version: '3.8'
services:
nginx:
image: nginx:alpine
ports:
- "80:80"
volumes:
- ./nginx.dev.conf:/etc/nginx/conf.d/default.conf
- ./logs/nginx:/var/log/nginx
depends_on:
- frontend
- backend
restart: unless-stopped
networks:
- fibdash-network
frontend:
build:
context: .
dockerfile: Dockerfile.dev.frontend
ports:
- "5001:5001"
volumes:
- ./client:/app/client
- /app/node_modules
environment:
- NODE_ENV=development
- CHOKIDAR_USEPOLLING=true
networks:
- fibdash-network
command: npm run dev:frontend
backend:
build:
context: .
dockerfile: Dockerfile.dev.backend
ports:
- "5000:5000"
volumes:
- ./src:/app/src
- /app/node_modules
environment:
- NODE_ENV=development
env_file:
- .env
networks:
- fibdash-network
command: npm run dev:backend
networks:
fibdash-network:
driver: bridge
volumes:
node_modules:

package-lock.json (generated, 34 changed lines)
View File

@@ -21,6 +21,8 @@
"google-auth-library": "^9.0.0",
"jsonwebtoken": "^9.0.0",
"mssql": "^9.1.0",
"nodemailer": "^7.0.5",
"openai": "^5.12.0",
"react": "^18.2.0",
"react-dom": "^18.2.0"
},
@@ -7142,6 +7144,15 @@
"dev": true,
"license": "MIT"
},
"node_modules/nodemailer": {
"version": "7.0.5",
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-7.0.5.tgz",
"integrity": "sha512-nsrh2lO3j4GkLLXoeEksAMgAOqxOv6QumNRVQTJwKH4nuiww6iC2y7GyANs9kRAxCexg3+lTWM3PZ91iLlVjfg==",
"license": "MIT-0",
"engines": {
"node": ">=6.0.0"
}
},
"node_modules/nodemon": {
"version": "3.1.10",
"resolved": "https://registry.npmjs.org/nodemon/-/nodemon-3.1.10.tgz",
@@ -7365,6 +7376,27 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/openai": {
"version": "5.12.0",
"resolved": "https://registry.npmjs.org/openai/-/openai-5.12.0.tgz",
"integrity": "sha512-vUdt02xiWgOHiYUmW0Hj1Qu9OKAiVQu5Bd547ktVCiMKC1BkB5L3ImeEnCyq3WpRKR6ZTaPgekzqdozwdPs7Lg==",
"license": "Apache-2.0",
"bin": {
"openai": "bin/cli"
},
"peerDependencies": {
"ws": "^8.18.0",
"zod": "^3.23.8"
},
"peerDependenciesMeta": {
"ws": {
"optional": true
},
"zod": {
"optional": true
}
}
},
"node_modules/own-keys": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/own-keys/-/own-keys-1.0.1.tgz",
@@ -9827,7 +9859,7 @@
"version": "8.18.3",
"resolved": "https://registry.npmjs.org/ws/-/ws-8.18.3.tgz",
"integrity": "sha512-PEIGCY5tSlUt50cqyMXfCzX+oOPqN0vuGqWzbcJ2xvnkzkq46oOpz7dQaTDBdfICb4N14+GARUDw2XV2N4tvzg==",
"dev": true,
"devOptional": true,
"license": "MIT",
"engines": {
"node": ">=10.0.0"

View File

@@ -6,7 +6,7 @@
"scripts": {
"dev": "concurrently \"npm run dev:frontend\" \"npm run dev:backend\"",
"dev:frontend": "webpack serve --mode development --config webpack.config.js",
"dev:backend": "nodemon src/index.js",
"dev:backend": "nodemon --exitcrash src/index.js",
"build": "webpack --config webpack.prod.config.js",
"build:prod": "npm run build && npm run start:prod",
"start": "npm run build && node src/index.js",
@@ -32,6 +32,8 @@
"google-auth-library": "^9.0.0",
"jsonwebtoken": "^9.0.0",
"mssql": "^9.1.0",
"nodemailer": "^7.0.5",
"openai": "^5.12.0",
"react": "^18.2.0",
"react-dom": "^18.2.0"
},

View File

@@ -73,10 +73,42 @@ const executeQuery = async (query, params = {}) => {
}
};
const executeTransaction = async (callback) => {
if (!process.env.DB_SERVER) {
throw new Error('Database not configured');
}
let pool;
let transaction;
try {
pool = await getPool();
transaction = new sql.Transaction(pool);
await transaction.begin();
const result = await callback(transaction);
await transaction.commit();
return result;
} catch (error) {
if (transaction) {
try {
await transaction.rollback();
} catch (rollbackError) {
console.error('Transaction rollback failed:', rollbackError);
}
}
console.error('Transaction error:', error);
throw error;
}
};
module.exports = {
config,
getPool,
testConnection,
executeQuery,
executeTransaction,
sql,
};
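A minimal usage sketch of the new executeTransaction helper (illustrative only, not part of this change; the clearBuForItem name and the require path are assumptions). The callback receives the open mssql transaction, and any error thrown inside it triggers a rollback before being re-thrown:

const { executeTransaction, sql } = require('./config/database');

const clearBuForItem = async (id) => {
  return executeTransaction(async (transaction) => {
    // Runs inside BEGIN/COMMIT; a throw here causes a rollback.
    const result = await new sql.Request(transaction)
      .input('id', id)
      .query('UPDATE fibdash.AccountingItems SET bu = NULL WHERE id = @id');
    return result.rowsAffected[0];
  });
};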

View File

@@ -10,7 +10,7 @@ const dataRoutes = require('./routes/data');
const dbConfig = require('./config/database');
const app = express();
const PORT = process.env.PORT || 5000;
const PORT = process.env.PORT || 5500;
// Middleware
app.use(cors());

View File

@@ -76,6 +76,98 @@ router.post('/google', async (req, res) => {
}
});
// Google OAuth callback (redirect flow)
router.post('/google/callback', async (req, res) => {
try {
const { code, redirect_uri } = req.body;
console.log('🔄 Processing OAuth callback with authorization code');
if (!code) {
console.log('❌ No authorization code provided');
return res.status(400).json({ error: 'Authorization code is required' });
}
// Exchange authorization code for tokens
const tokenResponse = await fetch('https://oauth2.googleapis.com/token', {
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
},
body: new URLSearchParams({
client_id: process.env.GOOGLE_CLIENT_ID,
client_secret: process.env.GOOGLE_CLIENT_SECRET,
code: code,
grant_type: 'authorization_code',
redirect_uri: redirect_uri,
}),
});
if (!tokenResponse.ok) {
const errorData = await tokenResponse.text();
console.log('❌ Token exchange failed:', errorData);
return res.status(400).json({ error: 'Failed to exchange authorization code' });
}
const tokens = await tokenResponse.json();
console.log('🎯 Received tokens from Google');
// Use the ID token to get user info
const ticket = await client.verifyIdToken({
idToken: tokens.id_token,
audience: process.env.GOOGLE_CLIENT_ID,
});
const payload = ticket.getPayload();
const googleId = payload['sub'];
const email = payload['email'];
const name = payload['name'];
const picture = payload['picture'];
console.log(`👤 OAuth callback verified for: ${email}`);
// Check if email is authorized
const authorized = await isEmailAuthorized(email);
console.log(`🔒 Email authorization check for ${email}: ${authorized ? 'ALLOWED' : 'DENIED'}`);
if (!authorized) {
console.log(`❌ Access denied for ${email}`);
return res.status(403).json({
error: 'Access denied',
message: 'Your email address is not authorized to access this application'
});
}
// Create user object
const user = {
id: googleId,
email,
name,
picture,
google_id: googleId,
};
console.log('✅ User object created from OAuth callback');
// Generate JWT token
const jwtToken = generateToken(user);
res.json({
success: true,
token: jwtToken,
user: {
id: user.id,
email: user.email,
name: user.name,
picture: user.picture,
},
});
} catch (error) {
console.error('OAuth callback error:', error);
res.status(401).json({ error: 'OAuth authentication failed' });
}
});
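A hedged sketch of the matching frontend call for the new redirect flow (the /api/auth prefix, the completeGoogleLogin name, and reading ?code= from window.location are assumptions, not part of this change). It posts the authorization code plus the redirect URI and stores the returned JWT:

const completeGoogleLogin = async () => {
  const code = new URLSearchParams(window.location.search).get('code');
  const response = await fetch('/api/auth/google/callback', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ code, redirect_uri: window.location.origin + '/oauth/callback' }),
  });
  if (!response.ok) throw new Error('OAuth callback failed');
  const { token, user } = await response.json();
  localStorage.setItem('token', token); // assumed storage location
  return user;
};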
// Verify JWT token
router.get('/verify', authenticateToken, async (req, res) => {
try {

View File

@@ -0,0 +1,273 @@
const express = require('express');
const { authenticateToken } = require('../../middleware/auth');
const router = express.Router();
// Debug: Get JTL Kontierung data for a specific JTL Umsatz (by kZahlungsabgleichUmsatz)
router.get('/jtl-kontierung/:jtlId', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const { jtlId } = req.params;
const query = `
SELECT
uk.data
FROM eazybusiness.dbo.tZahlungsabgleichUmsatz z
LEFT JOIN eazybusiness.dbo.tUmsatzKontierung uk
ON uk.kZahlungsabgleichUmsatz = z.kZahlungsabgleichUmsatz
WHERE z.kZahlungsabgleichUmsatz = @jtlId
`;
const result = await executeQuery(query, { jtlId: parseInt(jtlId, 10) });
// Return undefined when no data is found (rather than a misleading empty array or string)
if (!result.recordset || result.recordset.length === 0) {
return res.json({ data: undefined });
}
// If multiple rows exist, return all; otherwise single object
const rows = result.recordset.map(r => ({ data: r.data }));
if (rows.length === 1) {
return res.json(rows[0]);
}
return res.json(rows);
} catch (error) {
console.error('Error fetching JTL Kontierung data:', error);
res.status(500).json({ error: 'Failed to fetch JTL Kontierung data' });
}
});
// Get accounting items for a specific transaction
router.get('/accounting-items/:transactionId', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const { transactionId } = req.params;
// Try both numeric and string format (similar to banking transactions)
let query, params;
const numericId = parseInt(transactionId, 10);
if (!isNaN(numericId) && numericId.toString() === transactionId) {
// It's a numeric ID - check transaction_id column
query = `
SELECT
ai.*,
k.name as konto_name,
bu.name as bu_name,
bu.vst as bu_vst
FROM fibdash.AccountingItems ai
LEFT JOIN fibdash.Konto k ON ai.konto = k.konto
LEFT JOIN fibdash.BU bu ON ai.bu = bu.bu
WHERE ai.transaction_id = @transactionId
ORDER BY ai.id
`;
params = { transactionId: numericId };
} else {
// It's a string ID - check csv_transaction_id column
query = `
SELECT
ai.*,
k.name as konto_name,
bu.name as bu_name,
bu.vst as bu_vst
FROM fibdash.AccountingItems ai
LEFT JOIN fibdash.Konto k ON ai.konto = k.konto
LEFT JOIN fibdash.BU bu ON ai.bu = bu.bu
WHERE ai.csv_transaction_id = @transactionId
ORDER BY ai.id
`;
params = { transactionId };
}
const result = await executeQuery(query, params);
res.json(result.recordset);
} catch (error) {
console.error('Error fetching accounting items:', error);
res.status(500).json({ error: 'Failed to fetch accounting items' });
}
});
// Create accounting item for a transaction
router.post('/accounting-items', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const {
transaction_id,
csv_transaction_id,
umsatz_brutto,
soll_haben_kz,
konto,
bu,
buchungsdatum,
rechnungsnummer,
buchungstext
} = req.body;
if ((!transaction_id && !csv_transaction_id) || !umsatz_brutto || !soll_haben_kz || !konto || !buchungsdatum) {
return res.status(400).json({
error: 'Transaction ID, amount, debit/credit indicator, account, and booking date are required'
});
}
let insertQuery, queryParams;
if (csv_transaction_id) {
// For CSV transactions, use placeholder transaction_id
insertQuery = `
INSERT INTO fibdash.AccountingItems
(transaction_id, csv_transaction_id, umsatz_brutto, soll_haben_kz, konto, gegenkonto, bu, buchungsdatum, rechnungsnummer, buchungstext)
OUTPUT INSERTED.*
VALUES (-1, @csv_transaction_id, @umsatz_brutto, @soll_haben_kz, @konto, '', @bu, @buchungsdatum, @rechnungsnummer, @buchungstext)
`;
queryParams = {
csv_transaction_id,
umsatz_brutto,
soll_haben_kz,
konto,
bu: bu || null,
buchungsdatum,
rechnungsnummer: rechnungsnummer || null,
buchungstext: buchungstext || null
};
} else {
// For regular transactions
insertQuery = `
INSERT INTO fibdash.AccountingItems
(transaction_id, csv_transaction_id, umsatz_brutto, soll_haben_kz, konto, gegenkonto, bu, buchungsdatum, rechnungsnummer, buchungstext)
OUTPUT INSERTED.*
VALUES (@transaction_id, NULL, @umsatz_brutto, @soll_haben_kz, @konto, '', @bu, @buchungsdatum, @rechnungsnummer, @buchungstext)
`;
queryParams = {
transaction_id,
umsatz_brutto,
soll_haben_kz,
konto,
bu: bu || null,
buchungsdatum,
rechnungsnummer: rechnungsnummer || null,
buchungstext: buchungstext || null
};
}
const result = await executeQuery(insertQuery, queryParams);
res.status(201).json(result.recordset[0]);
} catch (error) {
console.error('Error creating accounting item:', error);
res.status(500).json({ error: 'Failed to create accounting item' });
}
});
// Update accounting item
router.put('/accounting-items/:id', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const { id } = req.params;
const { umsatz_brutto, soll_haben_kz, konto, bu, rechnungsnummer, buchungstext } = req.body;
if (!umsatz_brutto || !soll_haben_kz || !konto) {
return res.status(400).json({ error: 'Amount, debit/credit indicator, and account are required' });
}
const updateQuery = `
UPDATE fibdash.AccountingItems
SET umsatz_brutto = @umsatz_brutto,
soll_haben_kz = @soll_haben_kz,
konto = @konto,
bu = @bu,
rechnungsnummer = @rechnungsnummer,
buchungstext = @buchungstext
OUTPUT INSERTED.*
WHERE id = @id
`;
const result = await executeQuery(updateQuery, {
umsatz_brutto,
soll_haben_kz,
konto,
bu: bu || null,
rechnungsnummer: rechnungsnummer || null,
buchungstext: buchungstext || null,
id: parseInt(id, 10)
});
if (result.recordset.length === 0) {
return res.status(404).json({ error: 'Accounting item not found' });
}
res.json(result.recordset[0]);
} catch (error) {
console.error('Error updating accounting item:', error);
res.status(500).json({ error: 'Failed to update accounting item' });
}
});
// Delete accounting item
router.delete('/accounting-items/:id', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const { id } = req.params;
const deleteQuery = `DELETE FROM fibdash.AccountingItems WHERE id = @id`;
await executeQuery(deleteQuery, { id: parseInt(id, 10) });
res.json({ message: 'Accounting item deleted successfully' });
} catch (error) {
console.error('Error deleting accounting item:', error);
res.status(500).json({ error: 'Failed to delete accounting item' });
}
});
// Get all Konto options
router.get('/kontos', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const query = `SELECT * FROM fibdash.Konto ORDER BY konto`;
const result = await executeQuery(query);
res.json(result.recordset);
} catch (error) {
console.error('Error fetching kontos:', error);
res.status(500).json({ error: 'Failed to fetch kontos' });
}
});
// Create new Konto
router.post('/kontos', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const { konto, name } = req.body;
if (!konto || !name) {
return res.status(400).json({ error: 'Konto and name are required' });
}
const insertQuery = `
INSERT INTO fibdash.Konto (konto, name)
OUTPUT INSERTED.*
VALUES (@konto, @name)
`;
const result = await executeQuery(insertQuery, { konto, name });
res.status(201).json(result.recordset[0]);
} catch (error) {
console.error('Error creating konto:', error);
res.status(500).json({ error: 'Failed to create konto' });
}
});
// Get all BU options
router.get('/bus', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const query = `SELECT * FROM fibdash.BU ORDER BY bu`;
const result = await executeQuery(query);
res.json(result.recordset);
} catch (error) {
console.error('Error fetching BUs:', error);
res.status(500).json({ error: 'Failed to fetch BUs' });
}
});
module.exports = router;
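An illustrative client call for the new accounting-items endpoint (a sketch; the /api prefix, Bearer-token header, and loadAccountingItems name are assumptions). Purely numeric IDs are looked up via transaction_id, all other IDs via csv_transaction_id:

const loadAccountingItems = async (transactionId, token) => {
  const response = await fetch(`/api/accounting-items/${transactionId}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json(); // array of items joined with Konto and BU names
};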

View File

@@ -71,6 +71,37 @@ router.post('/test-csv-import', async (req, res) => {
numericAmount = parseFloat(normalizedAmount) || 0;
}
// Check for existing transaction to prevent duplicates
const duplicateCheckQuery = `
SELECT COUNT(*) as count FROM fibdash.CSVTransactions
WHERE buchungstag = @buchungstag
AND wertstellung = @wertstellung
AND umsatzart = @umsatzart
AND betrag = @betrag
AND beguenstigter_zahlungspflichtiger = @beguenstigter_zahlungspflichtiger
AND verwendungszweck = @verwendungszweck
`;
const duplicateCheckResult = await executeQuery(duplicateCheckQuery, {
buchungstag: transaction['Buchungstag'] || null,
wertstellung: transaction['Valutadatum'] || null,
umsatzart: transaction['Buchungstext'] || null,
betrag: numericAmount,
beguenstigter_zahlungspflichtiger: transaction['Beguenstigter/Zahlungspflichtiger'] || null,
verwendungszweck: transaction['Verwendungszweck'] || null
});
if (duplicateCheckResult.recordset[0].count > 0) {
console.log(`Skipping duplicate transaction at row ${i + 1}: ${transaction['Buchungstag']} - ${numericAmount}`);
errors.push({
row: i + 1,
error: 'Duplicate transaction (already exists in database)',
transaction: transaction
});
errorCount++;
continue;
}
const insertQuery = `
INSERT INTO fibdash.CSVTransactions
(buchungstag, wertstellung, umsatzart, betrag, betrag_original, waehrung,
@@ -129,6 +160,7 @@ router.post('/test-csv-import', async (req, res) => {
// Import CSV transactions to database
router.post('/import-csv-transactions', authenticateToken, async (req, res) => {
console.log('Importing CSV transactions');
try {
const { executeQuery } = require('../../config/database');
const { transactions, filename, batchId, headers } = req.body;
@@ -180,6 +212,7 @@ router.post('/import-csv-transactions', authenticateToken, async (req, res) => {
let successCount = 0;
let errorCount = 0;
const errors = [];
console.log('precheck done');
for (let i = 0; i < transactions.length; i++) {
const transaction = transactions[i];
@@ -195,9 +228,6 @@ router.post('/import-csv-transactions', authenticateToken, async (req, res) => {
validationErrors.push('Betrag is required');
}
if (!transaction['Beguenstigter/Zahlungspflichtiger'] || transaction['Beguenstigter/Zahlungspflichtiger'].trim() === '') {
validationErrors.push('Beguenstigter/Zahlungspflichtiger is required');
}
if (validationErrors.length > 2) {
console.log('Skipping invalid row ' + (i + 1) + ':', validationErrors);
@@ -247,6 +277,37 @@ router.post('/import-csv-transactions', authenticateToken, async (req, res) => {
numericAmount = parseFloat(normalizedAmount) || 0;
}
// Check for existing transaction to prevent duplicates
const duplicateCheckQuery = `
SELECT COUNT(*) as count FROM fibdash.CSVTransactions
WHERE buchungstag = @buchungstag
AND wertstellung = @wertstellung
AND umsatzart = @umsatzart
AND betrag = @betrag
AND beguenstigter_zahlungspflichtiger = @beguenstigter_zahlungspflichtiger
AND verwendungszweck = @verwendungszweck
`;
const duplicateCheckResult = await executeQuery(duplicateCheckQuery, {
buchungstag: transaction['Buchungstag'] || null,
wertstellung: transaction['Valutadatum'] || null,
umsatzart: transaction['Buchungstext'] || null,
betrag: numericAmount,
beguenstigter_zahlungspflichtiger: transaction['Beguenstigter/Zahlungspflichtiger'] || null,
verwendungszweck: transaction['Verwendungszweck'] || null
});
if (duplicateCheckResult.recordset[0].count > 0) {
console.log(`Skipping duplicate transaction at row ${i + 1}: ${transaction['Buchungstag']} - ${numericAmount}`);
errors.push({
row: i + 1,
error: 'Duplicate transaction (already exists in database)',
transaction: transaction
});
errorCount++;
continue;
}
const insertQuery = `
INSERT INTO fibdash.CSVTransactions
(buchungstag, wertstellung, umsatzart, betrag, betrag_original, waehrung,
@@ -287,6 +348,8 @@ router.post('/import-csv-transactions', authenticateToken, async (req, res) => {
errorCount++;
}
}
console.log('import done', errors);
res.json({
success: true,
@@ -370,4 +433,203 @@ router.get('/csv-import-batches', authenticateToken, async (req, res) => {
}
});
// Import DATEV Beleglinks to database
router.post('/import-datev-beleglinks', authenticateToken, async (req, res) => {
try {
const { executeQuery } = require('../../config/database');
const { beleglinks, filename, batchId, headers } = req.body;
if (!beleglinks || !Array.isArray(beleglinks)) {
return res.status(400).json({ error: 'Beleglinks array is required' });
}
// Expected DATEV CSV headers from the example
const expectedHeaders = [
'Belegart', 'Geschäftspartner-Name', 'Geschäftspartner-Konto', 'Rechnungsbetrag', 'WKZ',
'Rechnungs-Nr.', 'Interne Re.-Nr.', 'Rechnungsdatum', 'BU', 'Konto', 'Konto-Bezeichnung',
'Ware/Leistung', 'Zahlungszuordnung', 'Kontoumsatzzuordnung', 'Gebucht', 'Festgeschrieben',
'Kopie', 'Eingangsdatum', 'Bezahlt', 'BezahltAm', 'Geschäftspartner-Ort', 'Skonto-Betrag 1',
'Fällig mit Skonto 1', 'Skonto 1 in %', 'Skonto-Betrag 2', 'Fällig mit Skonto 2',
'Skonto 2 in %', 'Fällig ohne Skonto', 'Steuer in %', 'USt-IdNr.', 'Kunden-Nr.',
'KOST 1', 'KOST 2', 'KOST-Menge', 'Kurs', 'Nachricht', 'Freier Text', 'IBAN', 'BIC',
'Bankkonto-Nr.', 'BLZ', 'Notiz', 'Land', 'Personalnummer', 'Nachname', 'Vorname',
'Belegkategorie', 'Bezeichnung', 'Abrechnungsmonat', 'Gültig bis', 'Prüfungsrelevant',
'Ablageort', 'Belegtyp', 'Herkunft', 'Leistungsdatum', 'Buchungstext', 'Beleg-ID',
'Zahlungsbedingung', 'Geheftet', 'Gegenkonto', 'keine Überweisung/Lastschrift erstellen',
'Aufgeteilt', 'Bereitgestellt', 'Freigegeben', 'FreigegebenAm', 'Erweiterte Belegdaten fehlen',
'Periode fehlt', 'Rechnungsdaten beim Import fehlen'
];
if (beleglinks.length === 0) {
return res.status(400).json({ error: 'No beleglink data found' });
}
const importBatchId = batchId || 'datev_import_' + Date.now();
let successCount = 0;
let errorCount = 0;
let updateCount = 0;
let insertCount = 0;
let skippedCount = 0;
const errors = [];
for (let i = 0; i < beleglinks.length; i++) {
const beleglink = beleglinks[i];
try {
// Skip empty rows or rows without Beleg-ID
const belegId = beleglink['Beleg-ID'];
if (!belegId || belegId.trim() === '') {
console.log(`Skipping row ${i + 1}: No Beleg-ID found`);
skippedCount++;
continue;
}
const validationErrors = [];
// Parse amount if available
let numericAmount = null;
if (beleglink['Rechnungsbetrag']) {
const amountStr = beleglink['Rechnungsbetrag'].toString().replace(/[^\d,.-]/g, '');
const normalizedAmount = amountStr.replace(',', '.');
numericAmount = parseFloat(normalizedAmount) || null;
}
// Parse date if available
let parsedDate = null;
if (beleglink['Rechnungsdatum']) {
const dateStr = beleglink['Rechnungsdatum'].trim();
const dateParts = dateStr.split(/[.\/\-]/);
if (dateParts.length === 3) {
const day = parseInt(dateParts[0], 10);
const month = parseInt(dateParts[1], 10) - 1;
let year = parseInt(dateParts[2], 10);
if (year < 100) {
year += (year < 50) ? 2000 : 1900;
}
parsedDate = new Date(year, month, day);
if (isNaN(parsedDate.getTime())) {
parsedDate = null;
}
}
}
// First, check if a record with this datevlink already exists
const checkExistingDatevLink = `
SELECT kUmsatzBeleg FROM eazybusiness.dbo.tUmsatzBeleg WHERE datevlink = @datevlink
`;
const existingDatevLink = await executeQuery(checkExistingDatevLink, { datevlink: belegId });
if (existingDatevLink.recordset.length > 0) {
// Record with this datevlink already exists - skip
console.log(`Datevlink already exists, skipping: ${belegId}`);
skippedCount++;
continue;
}
// Extract key from filename in 'Herkunft' column
// Examples: "Rechnung146.pdf" -> key 146 for tRechnung
// "UmsatzBeleg192.pdf" -> key 192 for tUmsatzBeleg
const herkunft = beleglink['Herkunft'];
if (!herkunft || herkunft.trim() === '') {
console.log(`Skipping row ${i + 1}: No filename in Herkunft column`);
skippedCount++;
continue;
}
// Extract the key from filename patterns
let matchFound = false;
// Pattern: UmsatzBeleg{key}.pdf -> match with tUmsatzBeleg.kUmsatzBeleg
const umsatzBelegMatch = herkunft.match(/UmsatzBeleg(\d+)\.pdf/i);
if (umsatzBelegMatch) {
const kUmsatzBeleg = parseInt(umsatzBelegMatch[1], 10);
const updateQuery = `
UPDATE eazybusiness.dbo.tUmsatzBeleg
SET datevlink = @datevlink
WHERE kUmsatzBeleg = @kUmsatzBeleg AND (datevlink IS NULL OR datevlink = '' OR datevlink = 'pending')
`;
const updateResult = await executeQuery(updateQuery, {
datevlink: belegId,
kUmsatzBeleg: kUmsatzBeleg
});
if (updateResult.rowsAffected && updateResult.rowsAffected[0] > 0) {
updateCount++;
console.log(`Added datevlink ${belegId} to tUmsatzBeleg.kUmsatzBeleg: ${kUmsatzBeleg}`);
matchFound = true;
} else {
console.log(`Skipping row ${i + 1}: UmsatzBeleg ${kUmsatzBeleg} nicht gefunden oder datevlink bereits gesetzt`);
skippedCount++;
}
}
// Pattern: Rechnung{key}.pdf -> match with tPdfObjekt.kPdfObjekt
const rechnungMatch = herkunft.match(/Rechnung(\d+)\.pdf/i);
if (!matchFound && rechnungMatch) {
const kPdfObjekt = parseInt(rechnungMatch[1], 10);
const updateQuery = `
UPDATE eazybusiness.dbo.tPdfObjekt
SET datevlink = @datevlink
WHERE kPdfObjekt = @kPdfObjekt AND (datevlink IS NULL OR datevlink = '' OR datevlink = 'pending')
`;
const updateResult = await executeQuery(updateQuery, {
datevlink: belegId,
kPdfObjekt: kPdfObjekt
});
if (updateResult.rowsAffected && updateResult.rowsAffected[0] > 0) {
updateCount++;
console.log(`Added datevlink ${belegId} to tPdfObjekt.kPdfObjekt: ${kPdfObjekt}`);
matchFound = true;
} else {
console.log(`Skipping row ${i + 1}: PdfObjekt ${kPdfObjekt} nicht gefunden oder datevlink bereits gesetzt`);
skippedCount++;
}
}
if (!matchFound) {
console.log(`Skipping row ${i + 1}: Unbekanntes Dateiformat '${herkunft}' (erwartet: UmsatzBeleg{key}.pdf oder Rechnung{key}.pdf)`);
skippedCount++;
continue;
}
successCount++;
} catch (error) {
console.error('Error processing beleglink ' + (i + 1) + ':', error);
errors.push({
row: i + 1,
error: error.message,
beleglink: beleglink
});
errorCount++;
}
}
res.json({
success: true,
batchId: importBatchId,
imported: updateCount, // Number of datevlinks actually added/updated
processed: successCount,
updated: updateCount,
inserted: insertCount,
skipped: skippedCount, // Records skipped (existing datevlinks)
errors: errorCount, // Only actual errors, not skipped records
details: errors.length > 0 ? errors : undefined,
message: `${updateCount} datevlinks hinzugefügt, ${skippedCount} bereits vorhanden, ${errorCount} Fehler`
});
} catch (error) {
console.error('Error importing DATEV beleglinks:', error);
res.status(500).json({ error: 'Failed to import DATEV beleglinks' });
}
});
module.exports = router;
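An illustrative request body for the new import-datev-beleglinks route (all field values are hypothetical). Only Beleg-ID and a Herkunft filename matching UmsatzBeleg{key}.pdf or Rechnung{key}.pdf drive the datevlink updates; the remaining DATEV columns are accepted but not evaluated here:

const payload = {
  filename: 'datev_beleglinks.csv',
  batchId: 'datev_import_' + Date.now(),
  beleglinks: [
    {
      'Beleg-ID': 'a1b2c3d4-0000-0000-0000-000000000000', // hypothetical GUID
      'Herkunft': 'UmsatzBeleg192.pdf', // matched against tUmsatzBeleg.kUmsatzBeleg = 192
      'Rechnungsbetrag': '119,00',
      'Rechnungsdatum': '05.03.2025',
    },
  ],
};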

View File

@@ -39,14 +39,33 @@ const formatDatevAmount = (amount) => {
return Math.abs(amount).toFixed(2).replace('.', ',');
};
const formatDatevDate = (dateString) => {
if (!dateString) return '';
const parts = dateString.split('.');
const formatDatevDate = (date) => {
if (!date) return '';
// Handle Date object
if (date instanceof Date) {
const day = date.getDate().toString().padStart(2, '0');
const month = (date.getMonth() + 1).toString().padStart(2, '0');
return day + month;
}
// Handle string date
const dateStr = date.toString();
const parts = dateStr.split('.');
if (parts.length === 3) {
const day = parts[0].padStart(2, '0');
const month = parts[1].padStart(2, '0');
return day + month;
}
// Try to parse as date string
const parsedDate = new Date(dateStr);
if (!isNaN(parsedDate)) {
const day = parsedDate.getDate().toString().padStart(2, '0');
const month = (parsedDate.getMonth() + 1).toString().padStart(2, '0');
return day + month;
}
return '';
};
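A few illustrative calls (not part of the committed file) showing the reworked formatDatevDate: Date objects, DD.MM.YYYY strings, and anything else Node's Date can parse all come out as the four-digit DDMM Belegdatum, with '' as the fallback:

console.log(formatDatevDate(new Date(2025, 2, 5))); // "0503"
console.log(formatDatevDate('05.03.2025'));         // "0503"
console.log(formatDatevDate('March 5, 2025'));      // "0503" via the Date-parsing fallback
console.log(formatDatevDate(''));                   // ""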
@@ -55,13 +74,219 @@ const quote = (str, maxLen = 60) => {
return '"' + str.slice(0, maxLen).replace(/"/g, '""') + '"';
};
// Parse konto field which might contain multiple accounts like "5400+5300"
const parseKonto = (konto) => {
if (!konto) return '';
// Take the first account number if multiple are present
const parts = konto.split('+');
return parts[0].trim();
};
// DATEV export endpoint
router.get('/datev/:timeRange', authenticateToken, async (req, res) => {
try {
const { timeRange } = req.params;
// TODO: Update to use database queries instead of CSV file
res.status(501).json({ error: 'DATEV export temporarily disabled - use database-based queries' });
return;
const { executeQuery } = require('../../config/database');
// Parse the time range to get start and end dates
let startDate, endDate;
if (timeRange.includes('-Q')) {
// Quarter format: 2025-Q1
const [year, quarterPart] = timeRange.split('-Q');
const quarter = parseInt(quarterPart, 10);
const startMonth = (quarter - 1) * 3 + 1;
const endMonth = startMonth + 2;
startDate = new Date(year, startMonth - 1, 1);
endDate = new Date(year, endMonth - 1, new Date(year, endMonth, 0).getDate());
} else if (timeRange.length === 4) {
// Year format: 2025
startDate = new Date(timeRange, 0, 1);
endDate = new Date(timeRange, 11, 31);
} else {
// Month format: 2025-03
const [year, month] = timeRange.split('-');
startDate = new Date(year, parseInt(month) - 1, 1);
endDate = new Date(year, parseInt(month), 0);
}
// Format dates for SQL query
const sqlStartDate = startDate.toISOString().split('T')[0];
const sqlEndDate = endDate.toISOString().split('T')[0];
// Query to get all DATEV data with proper joins
// This handles multiple documents per transaction by creating separate rows
const query = `
WITH DatevDocuments AS (
-- Get documents from tUmsatzBeleg
SELECT
uk.kZahlungsabgleichUmsatz,
zu.fBetrag as umsatz_brutto,
CASE WHEN zu.fBetrag < 0 THEN 'H' ELSE 'S' END as soll_haben_kz,
JSON_VALUE(uk.data, '$.konto1') as konto,
'' as gegenkonto, -- No creditorID in tUmsatzBeleg
-- BU determination based on amount and konto type
CASE
WHEN JSON_VALUE(uk.data, '$.konto1') IN ('3720', '3740', '2100', '1460', '1462') THEN ''
WHEN zu.fBetrag > 0 THEN ''
WHEN JSON_VALUE(uk.data, '$.konto1') LIKE '5%' THEN '9' -- 19% for purchases
WHEN JSON_VALUE(uk.data, '$.konto1') LIKE '6%' THEN '9' -- 19% for expenses
ELSE ''
END as bu,
FORMAT(zu.dBuchungsdatum, 'Mdd') as buchungsdatum_mdd,
zu.dBuchungsdatum,
'' as rechnungsnummer, -- No invoice number in tUmsatzBeleg
zu.cVerwendungszweck as buchungstext,
ub.datevlink as beleglink,
1 as priority -- tUmsatzBeleg has priority
FROM tUmsatzKontierung uk
INNER JOIN tZahlungsabgleichUmsatz zu ON uk.kZahlungsabgleichUmsatz = zu.kZahlungsabgleichUmsatz
INNER JOIN tUmsatzBeleg ub ON ub.kZahlungsabgleichUmsatz = zu.kZahlungsabgleichUmsatz
WHERE ub.datevlink IS NOT NULL
AND zu.dBuchungsdatum >= @startDate
AND zu.dBuchungsdatum <= @endDate
UNION ALL
-- Get documents from tPdfObjekt via tZahlungsabgleichUmsatzLink
SELECT
uk.kZahlungsabgleichUmsatz,
zu.fBetrag as umsatz_brutto,
CASE WHEN zu.fBetrag < 0 THEN 'H' ELSE 'S' END as soll_haben_kz,
JSON_VALUE(uk.data, '$.konto1') as konto,
COALESCE(JSON_VALUE(po.extraction, '$.creditorID'), '') as gegenkonto,
-- BU determination based on amount and konto type
CASE
WHEN JSON_VALUE(uk.data, '$.konto1') IN ('3720', '3740', '2100', '1460', '1462') THEN ''
WHEN zu.fBetrag > 0 THEN ''
WHEN JSON_VALUE(uk.data, '$.konto1') LIKE '5%' THEN '9' -- 19% for purchases
WHEN JSON_VALUE(uk.data, '$.konto1') LIKE '6%' THEN '9' -- 19% for expenses
ELSE ''
END as bu,
FORMAT(zu.dBuchungsdatum, 'Mdd') as buchungsdatum_mdd,
zu.dBuchungsdatum,
COALESCE(JSON_VALUE(po.extraction, '$.invoice_number'), '') as rechnungsnummer,
zu.cVerwendungszweck as buchungstext,
po.datevlink as beleglink,
2 as priority -- tPdfObjekt has lower priority
FROM tUmsatzKontierung uk
INNER JOIN tZahlungsabgleichUmsatz zu ON uk.kZahlungsabgleichUmsatz = zu.kZahlungsabgleichUmsatz
INNER JOIN tZahlungsabgleichUmsatzLink zul ON zu.kZahlungsabgleichUmsatz = zul.kZahlungsabgleichUmsatz
AND zul.linktype = 'kLieferantenBestellung'
INNER JOIN tPdfObjekt po ON zul.linktarget = po.kLieferantenbestellung
WHERE po.datevlink IS NOT NULL
AND zu.dBuchungsdatum >= @startDate
AND zu.dBuchungsdatum <= @endDate
UNION ALL
-- Get transactions without documents
SELECT
uk.kZahlungsabgleichUmsatz,
zu.fBetrag as umsatz_brutto,
CASE WHEN zu.fBetrag < 0 THEN 'H' ELSE 'S' END as soll_haben_kz,
JSON_VALUE(uk.data, '$.konto1') as konto,
'' as gegenkonto,
-- BU determination based on amount and konto type
CASE
WHEN JSON_VALUE(uk.data, '$.konto1') IN ('3720', '3740', '2100', '1460', '1462') THEN ''
WHEN zu.fBetrag > 0 THEN ''
WHEN JSON_VALUE(uk.data, '$.konto1') LIKE '5%' THEN '9' -- 19% for purchases
WHEN JSON_VALUE(uk.data, '$.konto1') LIKE '6%' THEN '9' -- 19% for expenses
ELSE ''
END as bu,
FORMAT(zu.dBuchungsdatum, 'Mdd') as buchungsdatum_mdd,
zu.dBuchungsdatum,
'' as rechnungsnummer,
zu.cVerwendungszweck as buchungstext,
'' as beleglink,
3 as priority -- No documents has lowest priority
FROM tUmsatzKontierung uk
INNER JOIN tZahlungsabgleichUmsatz zu ON uk.kZahlungsabgleichUmsatz = zu.kZahlungsabgleichUmsatz
WHERE zu.dBuchungsdatum >= @startDate
AND zu.dBuchungsdatum <= @endDate
AND NOT EXISTS (
SELECT 1 FROM tUmsatzBeleg ub2
WHERE ub2.kZahlungsabgleichUmsatz = zu.kZahlungsabgleichUmsatz
AND ub2.datevlink IS NOT NULL
)
AND NOT EXISTS (
SELECT 1 FROM tZahlungsabgleichUmsatzLink zul2
INNER JOIN tPdfObjekt po2 ON zul2.linktarget = po2.kLieferantenbestellung
WHERE zul2.kZahlungsabgleichUmsatz = zu.kZahlungsabgleichUmsatz
AND zul2.linktype = 'kLieferantenBestellung'
AND po2.datevlink IS NOT NULL
)
)
SELECT
*,
ROW_NUMBER() OVER (PARTITION BY kZahlungsabgleichUmsatz, beleglink ORDER BY priority) as rn
FROM DatevDocuments
ORDER BY dBuchungsdatum DESC, kZahlungsabgleichUmsatz, priority
`;
const result = await executeQuery(query, {
startDate: sqlStartDate,
endDate: sqlEndDate
});
// Format data for DATEV CSV
const datevRows = [];
// Build header
const periodStart = startDate.getFullYear() +
('0' + (startDate.getMonth() + 1)).slice(-2) +
('0' + startDate.getDate()).slice(-2);
const periodEnd = endDate.getFullYear() +
('0' + (endDate.getMonth() + 1)).slice(-2) +
('0' + endDate.getDate()).slice(-2);
datevRows.push(buildDatevHeader(periodStart, periodEnd));
datevRows.push(DATEV_COLS);
// Process each transaction
result.recordset.forEach(row => {
// Skip duplicate rows (keep only the first occurrence of each transaction+beleglink combination)
if (row.rn > 1) return;
const datevRow = [
formatDatevAmount(row.umsatz_brutto), // Umsatz (ohne Soll/Haben-Kz)
row.soll_haben_kz, // Soll/Haben-Kennzeichen
'', // WKZ Umsatz
'', // Kurs
'', // Basis-Umsatz
'', // WKZ Basis-Umsatz
parseKonto(row.konto), // Konto (parsed)
row.gegenkonto || '', // Gegenkonto (ohne BU-Schlüssel)
row.bu || '', // BU-Schlüssel
row.buchungsdatum_mdd || '', // Belegdatum (MDD format)
quote(row.rechnungsnummer || ''), // Belegfeld 1 (invoice number)
'', // Belegfeld 2
'', // Skonto
quote(row.buchungstext || ''), // Buchungstext
'', // Postensperre
'', // Diverse Adressnummer
'', // Geschäftspartnerbank
'', // Sachverhalt
'', // Zinssperre
row.beleglink || '' // Beleglink
].join(';');
datevRows.push(datevRow);
});
// Generate CSV content
const csvContent = datevRows.join('\n');
// Set headers for CSV download
const filename = `EXTF_${timeRange.replace('-', '_')}.csv`;
res.setHeader('Content-Type', 'text/csv; charset=windows-1252');
res.setHeader('Content-Disposition', `attachment; filename="${filename}"`);
// Send CSV content
res.send(csvContent);
} catch (error) {
console.error('Error generating DATEV export:', error);
res.status(500).json({ error: 'Failed to generate DATEV export' });

View File

@@ -0,0 +1,416 @@
const express = require('express');
const { authenticateToken } = require('../../middleware/auth');
const { executeQuery, executeTransaction } = require('../../config/database');
const sql = require('mssql');
const nodemailer = require('nodemailer');
const router = express.Router();
// Get document processing status
router.get('/document-status', authenticateToken, async (req, res) => {
try {
console.log('Document status endpoint called');
const queries = {
needMarkdownUmsatz: "SELECT COUNT(*) as count FROM tUmsatzBeleg WHERE markDown is null",
needMarkdownPdf: "SELECT COUNT(*) as count FROM tPdfObjekt WHERE markDown is null",
needExtractionUmsatz: "SELECT COUNT(*) as count FROM tUmsatzBeleg WHERE markDown is not null and extraction is null",
needExtractionPdf: "SELECT COUNT(*) as count FROM tPdfObjekt WHERE markDown is not null and extraction is null",
needDatevSyncUmsatz: "SELECT COUNT(*) as count FROM tUmsatzBeleg WHERE markDown is not null and datevlink is null",
needDatevSyncPdf: "SELECT COUNT(*) as count FROM tPdfObjekt WHERE markDown is not null and datevlink is null",
needDatevUploadUmsatz: "SELECT COUNT(*) as count FROM tUmsatzBeleg WHERE datevlink = 'pending'",
needDatevUploadPdf: "SELECT COUNT(*) as count FROM tPdfObjekt WHERE datevlink = 'pending'"
};
const results = {};
for (const [key, query] of Object.entries(queries)) {
const result = await executeQuery(query);
results[key] = result.recordset[0].count;
}
const status = {
needMarkdown: results.needMarkdownUmsatz + results.needMarkdownPdf,
needExtraction: results.needExtractionUmsatz + results.needExtractionPdf,
needDatevSync: results.needDatevSyncUmsatz + results.needDatevSyncPdf,
needDatevUpload: results.needDatevUploadUmsatz + results.needDatevUploadPdf,
details: {
markdown: {
umsatzBeleg: results.needMarkdownUmsatz,
pdfObjekt: results.needMarkdownPdf
},
extraction: {
umsatzBeleg: results.needExtractionUmsatz,
pdfObjekt: results.needExtractionPdf
},
datevSync: {
umsatzBeleg: results.needDatevSyncUmsatz,
pdfObjekt: results.needDatevSyncPdf
},
datevUpload: {
umsatzBeleg: results.needDatevUploadUmsatz,
pdfObjekt: results.needDatevUploadPdf
}
}
};
console.log('Document status computed:', status);
res.json(status);
} catch (error) {
console.error('Error fetching document processing status:', error);
res.status(500).json({ error: 'Failed to fetch document processing status' });
}
});
// Process markdown conversion
router.post('/process-markdown', authenticateToken, async (req, res) => {
try {
const { OpenAI } = require('openai');
// Check environment for OpenAI API key
if (!process.env.OPENAI_API_KEY) {
return res.status(500).json({ error: 'OpenAI API key not configured' });
}
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});
await executeTransaction(async (transaction) => {
// Process UmsatzBeleg documents
const umsatzResult = await new sql.Request(transaction).query(
"SELECT TOP 1 kUmsatzBeleg, content FROM tUmsatzBeleg WHERE markDown is null"
);
if (umsatzResult.recordset.length > 0) {
const { kUmsatzBeleg, content } = umsatzResult.recordset[0];
const response = await openai.responses.create({
model: "gpt-4o",
input: [
{ "role": "developer", "content": [{ "type": "input_text", "text": "Convert to Markdown" }] },
{ "role": "user", "content": [{ "type": "input_file", "filename": "invoice.pdf", "file_data": "data:application/pdf;base64," + content.toString('base64') }] }
],
text: {
"format": {
"type": "json_schema", "name": "markdown", "strict": true, "schema": { "type": "object", "properties": {
"output": { "type": "string", "description": "Input converted to Markdown" }
}, "required": ["output"], "additionalProperties": false }
}
},
tools: [],
store: false
});
const markdown = JSON.parse(response.output_text);
await new sql.Request(transaction)
.input('kUmsatzBeleg', kUmsatzBeleg)
.input('markDown', markdown.output)
.query("UPDATE tUmsatzBeleg SET markDown = @markDown WHERE kUmsatzBeleg = @kUmsatzBeleg");
}
// Process PdfObjekt documents
const pdfResult = await new sql.Request(transaction).query(
"SELECT TOP 1 kPdfObjekt, content FROM tPdfObjekt WHERE markDown is null"
);
if (pdfResult.recordset.length > 0) {
const { kPdfObjekt, content } = pdfResult.recordset[0];
const response = await openai.responses.create({
model: "gpt-4o",
input: [
{ "role": "developer", "content": [{ "type": "input_text", "text": "Convert to Markdown" }] },
{ "role": "user", "content": [{ "type": "input_file", "filename": "invoice.pdf", "file_data": "data:application/pdf;base64," + content.toString('base64') }] }
],
text: {
"format": {
"type": "json_schema", "name": "markdown", "strict": true, "schema": { "type": "object", "properties": {
"output": { "type": "string", "description": "Input converted to Markdown" }
}, "required": ["output"], "additionalProperties": false }
}
},
tools: [],
store: false
});
const markdown = JSON.parse(response.output_text);
await new sql.Request(transaction)
.input('kPdfObjekt', kPdfObjekt)
.input('markDown', markdown.output)
.query("UPDATE tPdfObjekt SET markDown = @markDown WHERE kPdfObjekt = @kPdfObjekt");
}
});
res.json({ success: true, message: 'Markdown processing completed' });
} catch (error) {
console.error('Error processing markdown:', error);
res.status(500).json({ error: 'Failed to process markdown: ' + error.message });
}
});
// Process data extraction
router.post('/process-extraction', authenticateToken, async (req, res) => {
try {
const { OpenAI } = require('openai');
if (!process.env.OPENAI_API_KEY) {
return res.status(500).json({ error: 'OpenAI API key not configured' });
}
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});
await executeTransaction(async (transaction) => {
// Get creditor IDs for extraction
const creditorResult = await new sql.Request(transaction).query(
"SELECT kreditorId FROM fibdash.Kreditor ORDER BY kreditorId"
);
const creditorIDs = creditorResult.recordset.map(r => r.kreditorId).join(', ');
// Process UmsatzBeleg documents
const umsatzResult = await new sql.Request(transaction).query(
"SELECT TOP 1 kUmsatzBeleg, markDown FROM tUmsatzBeleg WHERE markDown is not null and extraction is null"
);
if (umsatzResult.recordset.length > 0) {
const { kUmsatzBeleg, markDown } = umsatzResult.recordset[0];
const response = await openai.responses.create({
model: "gpt-5-mini",
input: [
{ "role": "developer", "content": [{ "type": "input_text", "text": `Extract specified information from provided input and structure it in a JSON format.
The aim is to accurately identify and capture the following elements:
- Rechnungsdatum/Belegdatum (Invoice Date/Document Date),
- Rechnungsnummer/Belegnummer (Invoice Number/Document Number),
- Netto Betrag (Net Amount),
- Brutto Betrag (Gross Amount),
- and Absender (Sender).
# Steps
1. **Identify Dates**: Find and extract the invoice or document date (Rechnungsdatum/Belegdatum) from the input text.
2. **Extract Numbers**: Locate and pull out the invoice or document number (Rechnungsnummer/Belegnummer).
3. **Determine Amounts**: Identify the net amount (Netto Betrag) and the gross amount (Brutto Betrag) and the currency in the text.
4. **Source the Sender**: Extract the sender's information (Absender, Country).
5. **Structure Data**: Organize the extracted information into a JSON format following the specified schema.
# Notes
- Ensure that dates are formatted consistently.
- Be mindful of various numerical representations (e.g., with commas or periods).
- The sender's information might include company names, so recognize various formats.
- Prioritize accuracy in identifying the correct fields, as there can be similar text elements present.
Also select the CreditorID, from that List: ${creditorIDs}` }] },
{ "role": "user", "content": [{ "type": "input_text", "text": markDown }] }
],
text: {
"format": {
"type": "json_schema", "name": "invoice", "strict": true, "schema": { "type": "object", "properties": {
"date": { "type": "string", "description": "Rechungsdatum / Belegdatum in ISO 8601" },
"invoice_number": { "type": "string", "description": "Rechnungsnummer / Belegnummer / Invoicenr" },
"net_amounts_and_tax": {
"type": "array", "description": "Liste von Nettobeträgen mit jeweiligem Steuersatz und Steuerbetrag, ein Listeneintrag pro Steuersatz",
"items": { "type": "object", "properties": {
"net_amount": { "type": "number", "description": "Netto Betrag" },
"tax_rate": { "type": "number", "description": "Steuersatz in Prozent" },
"tax_amount": { "type": "number", "description": "Steuerbetrag" }
}, "required": ["net_amount", "tax_rate", "tax_amount"], "additionalProperties": false }
},
"gross_amount": { "type": "number", "description": "Brutto Betrag (muss der Summe aller net_amount + tax_amount entsprechen)" },
"currency": { "type": "string", "description": "currency code in ISO 4217" },
"country": { "type": "string", "description": "country of origin in ISO 3166" },
"sender": { "type": "string", "description": "Absender" },
"creditorID": { "type": "string", "description": "CreditorID or empty if unknown" }
}, "required": ["date", "invoice_number", "net_amounts_and_tax", "gross_amount", "currency", "country", "sender", "creditorID"], "additionalProperties": false }
}
},
tools: [],
store: false
});
const extraction = JSON.parse(response.output_text);
await new sql.Request(transaction)
.input('kUmsatzBeleg', kUmsatzBeleg)
.input('extraction', JSON.stringify(extraction))
.query("UPDATE tUmsatzBeleg SET extraction = @extraction WHERE kUmsatzBeleg = @kUmsatzBeleg");
}
// Process PdfObjekt documents
const pdfResult = await new sql.Request(transaction).query(
"SELECT TOP 1 kPdfObjekt, markDown FROM tPdfObjekt WHERE markDown is not null and extraction is null"
);
if (pdfResult.recordset.length > 0) {
const { kPdfObjekt, markDown } = pdfResult.recordset[0];
const response = await openai.responses.create({
model: "gpt-5-mini",
input: [
{ "role": "developer", "content": [{ "type": "input_text", "text": `Extract specified information from provided input and structure it in a JSON format.
The aim is to accurately identify and capture the following elements:
- Rechnungsdatum/Belegdatum (Invoice Date/Document Date),
- Rechnungsnummer/Belegnummer (Invoice Number/Document Number),
- Netto Betrag (Net Amount),
- Brutto Betrag (Gross Amount),
- and Absender (Sender).
# Steps
1. **Identify Dates**: Find and extract the invoice or document date (Rechnungsdatum/Belegdatum) from the input text.
2. **Extract Numbers**: Locate and pull out the invoice or document number (Rechnungsnummer/Belegnummer).
3. **Determine Amounts**: Identify the net amount (Netto Betrag) and the gross amount (Brutto Betrag) and the currency in the text.
4. **Source the Sender**: Extract the sender's information (Absender, Country).
5. **Structure Data**: Organize the extracted information into a JSON format following the specified schema.
# Notes
- Ensure that dates are formatted consistently.
- Be mindful of various numerical representations (e.g., with commas or periods).
- The sender's information might include company names, so recognize various formats.
- Prioritize accuracy in identifying the correct fields, as there can be similar text elements present.
Also select the CreditorID, from that List: ${creditorIDs}` }] },
{ "role": "user", "content": [{ "type": "input_text", "text": markDown }] }
],
text: {
"format": {
"type": "json_schema", "name": "invoice", "strict": true, "schema": { "type": "object", "properties": {
"date": { "type": "string", "description": "Rechungsdatum / Belegdatum in ISO 8601" },
"invoice_number": { "type": "string", "description": "Rechnungsnummer / Belegnummer / Invoicenr" },
"net_amounts_and_tax": {
"type": "array", "description": "Liste von Nettobeträgen mit jeweiligem Steuersatz und Steuerbetrag, ein Listeneintrag pro Steuersatz",
"items": { "type": "object", "properties": {
"net_amount": { "type": "number", "description": "Netto Betrag" },
"tax_rate": { "type": "number", "description": "Steuersatz in Prozent" },
"tax_amount": { "type": "number", "description": "Steuerbetrag" }
}, "required": ["net_amount", "tax_rate", "tax_amount"], "additionalProperties": false }
},
"gross_amount": { "type": "number", "description": "Brutto Betrag (muss der Summe aller net_amount + tax_amount entsprechen)" },
"currency": { "type": "string", "description": "currency code in ISO 4217" },
"country": { "type": "string", "description": "country of origin in ISO 3166" },
"sender": { "type": "string", "description": "Absender" },
"creditorID": { "type": "string", "description": "CreditorID or empty if unknown" }
}, "required": ["date", "invoice_number", "net_amounts_and_tax", "gross_amount", "currency", "country", "sender", "creditorID"], "additionalProperties": false }
}
},
tools: [],
store: false
});
const extraction = JSON.parse(response.output_text);
await new sql.Request(transaction)
.input('kPdfObjekt', kPdfObjekt)
.input('extraction', JSON.stringify(extraction))
.query("UPDATE tPdfObjekt SET extraction = @extraction WHERE kPdfObjekt = @kPdfObjekt");
}
});
res.json({ success: true, message: 'Extraction processing completed' });
} catch (error) {
console.error('Error processing extraction:', error);
res.status(500).json({ error: 'Failed to process extraction: ' + error.message });
}
});
// Process Datev sync
router.post('/process-datev-sync', authenticateToken, async (req, res) => {
try {
const transporter = nodemailer.createTransport({
host: "smtp.gmail.com",
port: 587,
secure: false, // true for 465, false for other ports
auth: {
user: "sebgreenbus@gmail.com",
pass: "abrp idub thbi kdws", // For Gmail, you might need an app-specific password
},
});
await executeTransaction(async (transaction) => {
// Process UmsatzBeleg documents
const umsatzResult = await new sql.Request(transaction).query(
"SELECT TOP 1 kUmsatzBeleg, content FROM tUmsatzBeleg WHERE markDown is not null and datevlink is null"
);
if (umsatzResult.recordset.length > 0) {
const { kUmsatzBeleg, content } = umsatzResult.recordset[0];
const mailOptions = {
from: '"Growheads" <sebgreenbus@gmail.com>',
to: "97bfd9eb-770f-481a-accb-e69649d36a9e@uploadmail.datev.de",
subject: `Beleg ${kUmsatzBeleg} für Datev`,
text: "", // No body text as requested
attachments: [
{
filename: `UmsatzBeleg${kUmsatzBeleg}.pdf`,
content: content,
contentType: "application/pdf",
},
],
};
try {
let info = await transporter.sendMail(mailOptions);
console.log("Message sent: %s", info.messageId);
await new sql.Request(transaction)
.input('kUmsatzBeleg', kUmsatzBeleg)
.input('datevlink', 'pending')
.query("UPDATE tUmsatzBeleg SET datevlink = @datevlink WHERE kUmsatzBeleg = @kUmsatzBeleg");
} catch (emailError) {
console.error("Error sending email:", emailError);
throw emailError;
}
}
// Process PdfObjekt documents
const pdfResult = await new sql.Request(transaction).query(
"SELECT TOP 1 kPdfObjekt, content FROM tPdfObjekt WHERE markDown is not null and datevlink is null"
);
if (pdfResult.recordset.length > 0) {
const { kPdfObjekt, content } = pdfResult.recordset[0];
const mailOptions = {
from: '"Growheads" <sebgreenbus@gmail.com>',
to: "97bfd9eb-770f-481a-accb-e69649d36a9e@uploadmail.datev.de",
subject: `Rechnung ${kPdfObjekt} für Datev`,
text: "", // No body text as requested
attachments: [
{
filename: `Rechnung${kPdfObjekt}.pdf`,
content: content,
contentType: "application/pdf",
},
],
};
try {
let info = await transporter.sendMail(mailOptions);
console.log("Message sent: %s", info.messageId);
await new sql.Request(transaction)
.input('kPdfObjekt', kPdfObjekt)
.input('datevlink', 'pending')
.query("UPDATE tPdfObjekt SET datevlink = @datevlink WHERE kPdfObjekt = @kPdfObjekt");
} catch (emailError) {
console.error("Error sending email:", emailError);
throw emailError;
}
}
});
res.json({ success: true, message: 'Datev sync processing completed' });
} catch (error) {
console.error('Error processing Datev sync:', error);
res.status(500).json({ error: 'Failed to process Datev sync: ' + error.message });
}
});
module.exports = router;

View File

@@ -6,7 +6,9 @@ const datev = require('./datev');
const pdf = require('./pdf');
const kreditors = require('./kreditors');
const bankingTransactions = require('./bankingTransactions');
const accountingItems = require('./accountingItems');
const csvImport = require('./csvImport');
const documentProcessing = require('./documentProcessing');
const router = express.Router();
@@ -17,6 +19,8 @@ router.use(datev);
router.use(pdf);
router.use(kreditors);
router.use(bankingTransactions);
router.use(accountingItems);
router.use(csvImport);
router.use(documentProcessing);
module.exports = router;

View File

@@ -10,6 +10,25 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
const { timeRange } = req.params;
const { executeQuery } = require('../../config/database');
// Build WHERE clause based on timeRange format
let timeWhereClause = '';
if (timeRange.includes('-Q')) {
// Quarter format: 2025-Q2
const [year, quarterPart] = timeRange.split('-Q');
const quarter = parseInt(quarterPart, 10);
const startMonth = (quarter - 1) * 3 + 1;
const endMonth = startMonth + 2;
timeWhereClause = `WHERE YEAR(csv.parsed_date) = ${year} AND MONTH(csv.parsed_date) BETWEEN ${startMonth} AND ${endMonth}`;
} else if (timeRange.length === 4) {
// Year format: 2025
timeWhereClause = `WHERE YEAR(csv.parsed_date) = ${timeRange}`;
} else {
// Month format: 2025-07
const [year, month] = timeRange.split('-');
timeWhereClause = `WHERE YEAR(csv.parsed_date) = ${year} AND MONTH(csv.parsed_date) = ${parseInt(month, 10)}`;
}
const query = `
SELECT
csv.id as id,
@@ -47,6 +66,7 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
LEFT JOIN fibdash.Kreditor k ON csv.kontonummer_iban = k.iban
LEFT JOIN fibdash.BankingAccountTransactions bat ON csv.id = bat.csv_transaction_id
LEFT JOIN fibdash.Kreditor ak ON bat.assigned_kreditor_id = ak.id
${timeWhereClause}
UNION ALL
@@ -84,6 +104,12 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
WHERE ABS(csv.numeric_amount - jtl.fBetrag) < 0.01
AND ABS(DATEDIFF(day, csv.parsed_date, jtl.dBuchungsdatum)) <= 1
)
${timeRange.includes('-Q') ?
`AND YEAR(jtl.dBuchungsdatum) = ${timeRange.split('-Q')[0]} AND MONTH(jtl.dBuchungsdatum) BETWEEN ${(parseInt(timeRange.split('-Q')[1], 10) - 1) * 3 + 1} AND ${(parseInt(timeRange.split('-Q')[1], 10) - 1) * 3 + 3}` :
timeRange.length === 4 ?
`AND YEAR(jtl.dBuchungsdatum) = ${timeRange}` :
`AND YEAR(jtl.dBuchungsdatum) = ${timeRange.split('-')[0]} AND MONTH(jtl.dBuchungsdatum) = ${parseInt(timeRange.split('-')[1], 10)}`
}
ORDER BY parsed_date DESC
`;
@@ -108,7 +134,41 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
const linksResult = await executeQuery(linksQuery);
const linksData = linksResult.recordset || [];
const transactions = result.recordset.map(transaction => ({
// Group transactions by ID to handle multiple JTL matches
const transactionGroups = {};
result.recordset.forEach(row => {
const key = row.id;
if (!transactionGroups[key]) {
transactionGroups[key] = {
...row,
pdfs: [],
links: []
};
// Remove top-level kUmsatzBeleg and datevlink since they belong in pdfs array
delete transactionGroups[key].kUmsatzBeleg;
delete transactionGroups[key].datevlink;
delete transactionGroups[key].jtl_document_data;
}
// Add PDF data if present
if (row.jtl_document_data) {
transactionGroups[key].pdfs.push({
content: row.jtl_document_data,
kUmsatzBeleg: row.kUmsatzBeleg,
datevlink: row.datevlink
});
}
// Add links data if present
if (row.jtlId) {
const transactionLinks = linksData.filter(link =>
link.kZahlungsabgleichUmsatz === row.jtlId
);
transactionGroups[key].links.push(...transactionLinks);
}
});
const transactions = Object.values(transactionGroups).map(transaction => ({
...transaction,
parsedDate: new Date(transaction.parsed_date),
hasJTL: Boolean(transaction.hasJTL),
@@ -125,223 +185,36 @@ router.get('/transactions/:timeRange', authenticateToken, async (req, res) => {
id: transaction.assigned_kreditor_id,
kreditorId: transaction.assigned_kreditor_kreditorId
} : null,
pdfs: transaction.jtl_document_data ? [{
content: transaction.jtl_document_data,
kUmsatzBeleg: transaction.kUmsatzBeleg,
datevlink: transaction.datevlink
}] : [],
links: transaction.jtlId ? linksData.filter(link =>
link.kZahlungsabgleichUmsatz === transaction.jtlId
) : []
// Remove duplicate links
links: [...new Set(transaction.links.map(l => JSON.stringify(l)))].map(l => JSON.parse(l))
}));
let filteredTransactions = [];
if (timeRange.includes('-Q')) {
const [year, quarterPart] = timeRange.split('-Q');
const quarter = parseInt(quarterPart, 10);
const startMonth = (quarter - 1) * 3 + 1;
const endMonth = startMonth + 2;
filteredTransactions = transactions.filter(t => {
if (!t.monthYear) return false;
const [tYear, tMonth] = t.monthYear.split('-');
const monthNum = parseInt(tMonth, 10);
return tYear === year && monthNum >= startMonth && monthNum <= endMonth;
});
} else if (timeRange.length === 4) {
filteredTransactions = transactions.filter(t => {
if (!t.monthYear) return false;
const [tYear] = t.monthYear.split('-');
return tYear === timeRange;
});
} else {
filteredTransactions = transactions.filter(t => t.monthYear === timeRange);
}
const monthTransactions = filteredTransactions
// Transactions are already filtered by the SQL query, so we just need to sort them
const monthTransactions = transactions
.sort((a, b) => b.parsedDate - a.parsedDate);
// Get JTL transactions for comparison
let jtlTransactions = [];
let jtlDatabaseAvailable = false;
try {
jtlTransactions = await getJTLTransactions();
jtlDatabaseAvailable = true;
console.log('DEBUG: JTL database connected, found', jtlTransactions.length, 'transactions');
} catch (error) {
console.log('JTL database not available, continuing without JTL data:', error.message);
jtlDatabaseAvailable = false;
}
// Filter JTL transactions for the selected time period
let jtlMonthTransactions = [];
if (timeRange.includes('-Q')) {
const [year, quarterPart] = timeRange.split('-Q');
const quarter = parseInt(quarterPart, 10);
const startMonth = (quarter - 1) * 3 + 1;
const endMonth = startMonth + 2;
jtlMonthTransactions = jtlTransactions.filter(jtl => {
const jtlDate = new Date(jtl.dBuchungsdatum);
const jtlMonth = jtlDate.getMonth() + 1;
return jtlDate.getFullYear() === parseInt(year, 10) &&
jtlMonth >= startMonth && jtlMonth <= endMonth;
});
} else if (timeRange.length === 4) {
jtlMonthTransactions = jtlTransactions.filter(jtl => {
const jtlDate = new Date(jtl.dBuchungsdatum);
return jtlDate.getFullYear() === parseInt(timeRange, 10);
});
} else {
const [year, month] = timeRange.split('-');
jtlMonthTransactions = jtlTransactions.filter(jtl => {
const jtlDate = new Date(jtl.dBuchungsdatum);
return jtlDate.getFullYear() === parseInt(year, 10) &&
jtlDate.getMonth() === parseInt(month, 10) - 1;
});
}
// Get Kreditor information for IBAN lookup
let kreditorData = [];
try {
const kreditorQuery = `SELECT id, iban, name, kreditorId, is_banking FROM fibdash.Kreditor`;
const kreditorResult = await executeQuery(kreditorQuery);
kreditorData = kreditorResult.recordset || [];
} catch (error) {
console.log('Kreditor database not available, continuing without Kreditor data');
}
// Add JTL status and Kreditor information to each CSV transaction
const transactionsWithJTL = monthTransactions.map((transaction, index) => {
const amount = transaction.numericAmount;
const transactionDate = transaction.parsedDate;
if (index === 0) {
console.log('DEBUG First CSV transaction:', {
amount: amount,
transactionDate: transactionDate,
jtlMonthTransactionsCount: jtlMonthTransactions.length
});
if (jtlMonthTransactions.length > 0) {
console.log('DEBUG First JTL transaction:', {
amount: parseFloat(jtlMonthTransactions[0].fBetrag),
date: new Date(jtlMonthTransactions[0].dBuchungsdatum)
});
}
}
const jtlMatch = jtlMonthTransactions.find(jtl => {
const jtlAmount = parseFloat(jtl.fBetrag) || 0;
const jtlDate = new Date(jtl.dBuchungsdatum);
const amountMatch = Math.abs(amount - jtlAmount) < 0.01;
const dateMatch = transactionDate && jtlDate &&
transactionDate.getFullYear() === jtlDate.getFullYear() &&
transactionDate.getMonth() === jtlDate.getMonth() &&
transactionDate.getDate() === jtlDate.getDate();
if (index === 0 && (amountMatch || dateMatch)) {
console.log('DEBUG Potential match for first transaction:', {
csvAmount: amount,
jtlAmount: jtlAmount,
amountMatch: amountMatch,
csvDate: transactionDate,
jtlDate: jtlDate,
dateMatch: dateMatch,
bothMatch: amountMatch && dateMatch
});
}
return amountMatch && dateMatch;
});
const transactionIban = transaction['Kontonummer/IBAN'];
const kreditorMatch = transactionIban ? kreditorData.find(k => k.iban === transactionIban) : null;
return {
...transaction,
hasJTL: jtlDatabaseAvailable ? !!jtlMatch : undefined,
jtlId: jtlMatch ? jtlMatch.kZahlungsabgleichUmsatz : null,
isFromCSV: true,
jtlDatabaseAvailable,
pdfs: jtlMatch ? jtlMatch.pdfs || [] : [],
links: jtlMatch ? jtlMatch.links || [] : [],
kreditor: kreditorMatch ? {
id: kreditorMatch.id,
name: kreditorMatch.name,
kreditorId: kreditorMatch.kreditorId,
iban: kreditorMatch.iban,
is_banking: Boolean(kreditorMatch.is_banking)
} : null,
hasKreditor: !!kreditorMatch
};
});
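// JTL bookings that found no CSV counterpart (same one-cent amount tolerance, same calendar day)
// are appended below as synthetic rows flagged with isJTLOnly: true and isFromCSV: false.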
const unmatchedJTLTransactions = jtlMonthTransactions
.filter(jtl => {
const jtlAmount = parseFloat(jtl.fBetrag) || 0;
const jtlDate = new Date(jtl.dBuchungsdatum);
const hasCSVMatch = monthTransactions.some(transaction => {
const amount = transaction.numericAmount;
const transactionDate = transaction.parsedDate;
const amountMatch = Math.abs(amount - jtlAmount) < 0.01;
const dateMatch = transactionDate && jtlDate &&
transactionDate.getFullYear() === jtlDate.getFullYear() &&
transactionDate.getMonth() === jtlDate.getMonth() &&
transactionDate.getDate() === jtlDate.getDate();
return amountMatch && dateMatch;
});
return !hasCSVMatch;
})
.map(jtl => ({
'Buchungstag': new Date(jtl.dBuchungsdatum).toLocaleDateString('de-DE', {
day: '2-digit',
month: '2-digit',
year: '2-digit'
}),
'Verwendungszweck': jtl.cVerwendungszweck || '',
'Buchungstext': 'JTL Transaction',
'Beguenstigter/Zahlungspflichtiger': jtl.cName || '',
'Kontonummer/IBAN': '',
'Betrag': jtl.fBetrag ? jtl.fBetrag.toString().replace('.', ',') : '0,00',
numericAmount: parseFloat(jtl.fBetrag) || 0,
parsedDate: new Date(jtl.dBuchungsdatum),
monthYear: timeRange,
hasJTL: true,
jtlId: jtl.kZahlungsabgleichUmsatz,
isFromCSV: false,
isJTLOnly: true,
pdfs: jtl.pdfs || [],
links: jtl.links || [],
kreditor: null,
hasKreditor: false
}));
// Since transactions are already filtered and joined with JTL data in SQL,
// we don't need the complex post-processing logic anymore
const summary = {
- totalTransactions: filteredTransactions.length,
- totalIncome: filteredTransactions
+ totalTransactions: transactions.length,
+ totalIncome: transactions
.filter(t => t.numericAmount > 0)
.reduce((sum, t) => sum + t.numericAmount, 0),
- totalExpenses: filteredTransactions
+ totalExpenses: transactions
.filter(t => t.numericAmount < 0)
.reduce((sum, t) => sum + Math.abs(t.numericAmount), 0),
- netAmount: filteredTransactions.reduce((sum, t) => sum + t.numericAmount, 0),
+ netAmount: transactions.reduce((sum, t) => sum + t.numericAmount, 0),
timeRange: timeRange,
jtlDatabaseAvailable: true,
- jtlMatches: filteredTransactions.filter(t => t.hasJTL === true && t.isFromCSV).length,
- jtlMissing: filteredTransactions.filter(t => t.hasJTL === false && t.isFromCSV).length,
- jtlOnly: filteredTransactions.filter(t => t.isJTLOnly === true).length,
- csvOnly: filteredTransactions.filter(t => t.hasJTL === false && t.isFromCSV).length
+ jtlMatches: transactions.filter(t => t.hasJTL === true && t.isFromCSV).length,
+ jtlMissing: transactions.filter(t => t.hasJTL === false && t.isFromCSV).length,
+ jtlOnly: transactions.filter(t => t.isJTLOnly === true).length,
+ csvOnly: transactions.filter(t => t.hasJTL === false && t.isFromCSV).length
};
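// The handler responds with the transaction list plus the summary computed above
// (totals, net amount, and JTL match counts for the selected timeRange).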
res.json({
- transactions: filteredTransactions,
+ transactions: transactions,
summary
});
} catch (error) {

View File

@@ -44,7 +44,7 @@ module.exports = {
new HtmlWebpackPlugin({
template: './client/public/index.html',
templateParameters: {
- REACT_APP_GOOGLE_CLIENT_ID: process.env.GOOGLE_CLIENT_ID || 'your_google_client_id_here',
+ REACT_APP_GOOGLE_CLIENT_ID: process.env.GOOGLE_CLIENT_ID,
},
}),
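// With the hardcoded fallback removed, GOOGLE_CLIENT_ID now has to be supplied via the
// environment (e.g. a .env file); otherwise the value injected into the template is undefined.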
new webpack.DefinePlugin({
@@ -75,7 +75,7 @@ module.exports = {
},
proxy: {
'/api': {
- target: 'http://localhost:5000',
+ target: 'http://localhost:5500',
changeOrigin: true,
},
},
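// The '/api' dev-server proxy has to point at the port the backend actually listens on,
// which this change moves from 5000 to 5500.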

View File

@@ -85,7 +85,7 @@ module.exports = {
},
proxy: {
'/api': {
- target: 'http://localhost:5000',
+ target: 'http://localhost:5500',
changeOrigin: true,
},
},
@@ -94,4 +94,4 @@ module.exports = {
maxAssetSize: 512000,
maxEntrypointSize: 512000,
},
- };
+ };