
FraudX – Receipt Forgery Detection

KFUPM – ICS 619 | Rayan Alsubhi

Demo: VAT QR — clean Phase-1 invoice · expected QR check ✓ verified
Verdict — Medium risk

Caveats on VAT QR

No visual tampering detected, but VAT QR cross-check raised caveats.

Patch CNN: Low risk · forgery confidence 1%
VAT QR: caveats · 1 QR
No decision recorded.
⚠ Duplicate: exact match of a file already analysed in this session.

Saudi VAT QR check: caveats

ZATCA Phase-1 QR
Seller: Al-Othaim Markets
VAT registration #: 300012345600003
Timestamp: 2024-08-15T14:32:00Z
Invoice total: 334.72 SAR
VAT amount: 43.66 SAR
Format compliance: 7/7 passed · Cross-check vs receipt: 1/3 passed · 2 caveats
Format compliance
  • Payload decodes: Base64 + TLV well-formed; 5 tags found.
  • All mandatory tags present: tags 1 (seller), 2 (VAT #), 3 (timestamp), 4 (total), 5 (VAT).
  • VAT number format: 300012345600003 — 15 digits, starts and ends with 3.
  • Timestamp valid: 2024-08-15T14:32:00Z parses as ISO 8601.
  • Amounts numeric: total 334.72 SAR · VAT 43.66 SAR.
  • VAT ≤ total: VAT amount does not exceed the invoice total.
  • VAT rate ≈ 15%: implied rate 15.00% (within ±2 pp of 15%).
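The format-compliance checks above can be reproduced from the raw QR payload: ZATCA Phase-1 QR codes carry a Base64-encoded TLV structure with one-byte tag and one-byte length fields (tags 1–5). The sketch below mirrors the checks in this report; function names are illustrative, not FraudX's actual code.

```python
import base64
import re
from datetime import datetime

def decode_zatca_tlv(b64_payload: str) -> dict:
    """Decode a ZATCA Phase-1 QR payload: Base64 wrapping TLV triplets
    with one-byte tag and one-byte length fields."""
    raw = base64.b64decode(b64_payload)
    fields, i = {}, 0
    while i + 2 <= len(raw):
        tag, length = raw[i], raw[i + 1]
        fields[tag] = raw[i + 2:i + 2 + length].decode("utf-8")
        i += 2 + length
    return fields

def format_checks(fields: dict, tolerance_pp: float = 2.0) -> dict:
    """Mirror the report's format-compliance checks (a sketch)."""
    checks = {}
    # Tags 1-5: seller, VAT number, timestamp, invoice total, VAT amount.
    checks["mandatory tags present"] = all(t in fields for t in (1, 2, 3, 4, 5))
    # 15 digits, starting and ending with 3.
    checks["VAT number format"] = bool(re.fullmatch(r"3\d{13}3", fields.get(2, "")))
    try:
        datetime.fromisoformat(fields.get(3, "").replace("Z", "+00:00"))
        checks["timestamp ISO 8601"] = True
    except ValueError:
        checks["timestamp ISO 8601"] = False
    total, vat = float(fields[4]), float(fields[5])
    checks["amounts numeric"] = True  # float() above would have raised otherwise
    checks["VAT <= total"] = vat <= total
    implied_rate = 100.0 * vat / (total - vat)  # rate over the VAT-exclusive base
    checks["VAT rate ~ 15%"] = abs(implied_rate - 15.0) <= tolerance_pp
    return checks
```

With this invoice's values (total 334.72 SAR, VAT 43.66 SAR) the implied rate works out to 15.00%, consistent with the check above.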
Cross-check vs receipt
  • ? Seller name on receipt: 'Al-Othaim Markets' not found in the OCR'd receipt text (token overlap 0%); OCR may have missed the header.
  • Invoice total matches receipt: QR total 334.72 SAR appears in the receipt (closest OCR value 334.72).
  • ? VAT amount matches receipt: QR VAT 43.66 SAR not found directly in the OCR output; receipts often round or summarise VAT differently. Manual review suggested.
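The token-overlap figure in the seller-name caveat can be sketched as below; the exact tokenisation FraudX uses is not stated, so the normalisation rule here (lowercase alphanumeric runs) is an assumption.

```python
import re

def token_overlap(qr_value: str, ocr_text: str) -> float:
    """Fraction of the QR field's tokens that also appear in the OCR'd
    receipt text. Tokenisation is a guess at FraudX's rule: lowercase,
    alphanumeric runs only."""
    tokens = lambda s: set(re.findall(r"[a-z0-9]+", s.lower()))
    qr_tokens = tokens(qr_value)
    if not qr_tokens:
        return 0.0
    return len(qr_tokens & tokens(ocr_text)) / len(qr_tokens)
```

For the caveat above, `token_overlap("Al-Othaim Markets", ocr_text)` returning 0.0 corresponds to the reported 0% overlap, i.e. none of the seller's tokens survived OCR.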

Key findings

Document — original & heatmap
Original document
Suspicion heatmap (CNN overlay on the base document)

Suspicious regions

All 3 regions that exceeded the model's threshold, shown in descending confidence. Each corresponds to a 128 × 128 patch on the document; coordinates give the patch's top-left corner.

Region breakdown table (3 rows)
#  Coords      Score  Edit type  Field affected
1  (311, 64)   69%    CUT 91%    Company 33% (low conf)
2  (311, 128)  59%    CUT 90%    Product 39% (low conf)
3  (311, 0)    46%    CUT 93%    Company 56%
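The coordinates above are consistent with a sliding-window grid whose last column is clamped to the image edge (311 = 439 − 128, so all three regions sit in the rightmost column). A sketch of how such a grid can be enumerated; the clamping rule is an inference from the numbers in this report, not documented FraudX behaviour.

```python
def axis_positions(size: int, patch: int = 128, stride: int = 64) -> list:
    """Top-left offsets of sliding windows along one axis, with the last
    window clamped flush to the image edge (assumes size >= patch)."""
    pos = list(range(0, size - patch + 1, stride))
    if pos[-1] != size - patch:  # partial final step: clamp to the edge
        pos.append(size - patch)
    return pos

def patch_grid(width: int, height: int, patch: int = 128, stride: int = 64) -> list:
    """All (x, y) top-left corners of patches for a width x height image."""
    return [(x, y)
            for y in axis_positions(height, patch, stride)
            for x in axis_positions(width, patch, stride)]
```

For this 439 × 1004 px image the grid yields 90 windows, matching the "Patches scored" count in the metadata, with the rightmost column at x = 311 where all three flagged regions fall.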
Methodology, audit metadata & technical details
File: X00016469619_zatca.png · SHA-256: 6c13be62d872… · Size: 439 × 1004 px
Analysed: 2026-05-11 23:29:14 · Model: FraudX v2-multi · ResNet-18 · ep13 · thr=0.08
Patch precision: 92.25% (paper baseline OH-JPEG: 79.41%)
Patch F1 / AUC: 91.79 / 0.97
Image-level F1: 29.66 (paper baseline ChatGPT-relaxed: 28.39)
Patches scored (this image): 90, of which 81 text-bearing
Regions ≥ threshold: 3 (thr = 0.08)
Top-region score: 0.69

Approach. ResNet-18 patch classifier (128 × 128 patches, stride 64) trained on FINDIT2 (Tornes et al., ICDAR 2023). Two auxiliary heads classify the modification technique and the affected document field; the binary backbone is frozen, so the headline patch precision (92.25%) is preserved bit-for-bit while adding explainability. The image-level fraud score is the top-k mean of patch probabilities over text-bearing patches only (edge density ≥ 0.02), which keeps blank regions from diluting the score.
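The top-k aggregation described above can be sketched as follows. The value of k is not stated in this report, so the default here is an assumption, as are the function and parameter names.

```python
import numpy as np

def image_fraud_score(patch_probs, edge_densities, k=3, min_edge=0.02):
    """Top-k mean of patch probabilities over text-bearing patches only.
    k=3 is an assumption; the report does not state the value used."""
    probs = np.asarray(patch_probs, dtype=float)
    edges = np.asarray(edge_densities, dtype=float)
    text_bearing = probs[edges >= min_edge]  # drop blank (low-edge) patches
    if text_bearing.size == 0:
        return 0.0
    top_k = np.sort(text_bearing)[::-1][:k]  # k highest probabilities
    return float(top_k.mean())
```

Filtering on edge density before taking the top-k mean is what prevents large blank areas of a receipt from dragging the image-level score toward zero.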

Class accuracies on the FINDIT2 test set.

  • Edit type: CPI 73% · CUT 100% · IMI 38% · PIX 55% · Other 14%
  • Field: Total/payment 67% · Metadata 47% · Product 27% · Company 24%

Original technical findings (raw).

  • Patch CNN flagged 3 of 81 text-rich patches (top score 0.69).
  • Top 3 suspicious regions clustered around image coordinates (311,64), (311,128), (311,0).
  • Predicted modification mix across top regions: 3× CUT.
  • Predicted entity types: 2× Company, 1× Product.
  • Document is an EXACT duplicate of a file already analysed in this session (resubmission).