Building IntegrityLens: AI Biometric Verification in Angular 20+ with Firebase and OpenAI (12k+ Interviews Processed)


How I architected a secure, real-time candidate screening platform using Angular 20+, Signals/SignalStore, Firebase, and OpenAI—privacy by design, zero drama at scale.

Biometrics in the browser don’t forgive logging mistakes or jittery streams. We drew hard boundaries, streamed AI safely, and let typed events do the heavy lifting.

I’ve built airport kiosks, ad analytics dashboards, and IoT portals—systems that can’t afford surprises. IntegrityLens asked the same discipline from day one: real-time biometric verification in the browser, privacy by design, and AI that helps reviewers, not haunts ops. Here’s how I shipped it in Angular 20+ with Firebase and OpenAI, now processing 12k+ interviews.

A candidate, a webcam, and a firehose

Challenge

On launch week, a candidate’s liveness check and ID upload raced each other over café Wi‑Fi. If the UI jittered or we logged raw frames, we’d lose trust instantly. The platform needed real-time updates for reviewers, strict privacy boundaries, and AI that streamed suggestions without blocking or breaking.

  • Spotty networks during webcam liveness checks

  • PII/biometrics that must never leak to logs

  • Recruiter dashboards that update in real time without jitter

  • AI guidance that streams token-by-token and can be rolled back

Context

This was built as a remote Angular contractor engagement. If you need to hire an Angular developer or an Angular consultant for secure AI + biometrics, this is the playbook I bring in.

  • Angular 20+, Signals, SignalStore

  • Firebase Auth, Firestore, Storage, Functions

  • OpenAI Chat Completions + Embeddings via proxy

  • PrimeNG for steppers/uploads/overlays

  • Nx monorepo for app + functions + tests

Why real-time biometric verification in Angular 20+ is hard

What’s at stake

Browser-based biometrics combine camera streams, document scans, and AI hints. You must protect privacy, guarantee auditability, and keep UX smooth across weak connections. Change detection and streaming state get gnarly fast if you don’t constrain the surface area and measure everything.

  • Regulated data: IDs, faces, background checks

  • Complex flows: camera + file + AI + human review

  • Operational realities: flaky networks, vendor latency spikes

Design principles

We set hard boundaries first: no direct OpenAI calls from the browser, no raw biometrics in logs, and every state transition emits a typed event. Angular Signals/SignalStore keep rendering tight and predictable under streaming loads.

  • Privacy by design: segregate PII, default deny

  • Typed events over ad-hoc logs

  • Server-only AI calls via Functions proxy

  • RBAC/ABAC enforced by Custom Claims

Architecture that shipped: Angular, Firebase, OpenAI

High-level flow

Angular handles UX and Signals state. Firebase Auth issues tokens with org/role claims. The browser uploads only to signed URLs, then a Function validates files, performs liveness/ID checks, and stores minimal results. OpenAI runs behind that Function and streams sanitized tokens back to the UI. Recruiters watch sessions update in real time via Firestore snapshots.

  • Auth with org membership (Custom Claims)

  • Session doc created in Firestore with immutable IDs

  • Secure uploads to Storage (signed, resumable)

  • Liveness vendor + OCR vendor results stored server-side

  • Functions invoke OpenAI for guidance/scoring; stream back

  • Review dashboard subscribes to Firestore for real-time

Data model + events

Every write is paired with an append-only audit event. Reviewers see state; auditors see events. This split preserved performance and forensic clarity.

  • /orgs/{orgId}/candidates/{candidateId}/sessions/{sessionId}

  • /orgs/{orgId}/audit/{eventId} (append-only)

  • Typed events: SessionCreated, LivenessPassed, AiHintEmitted
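The event names above can be modeled as a discriminated union so the compiler enforces exhaustive handling. A minimal sketch — the fields beyond `type` are illustrative assumptions, not the production schema:

```typescript
// Hypothetical typed-event union for the audit log; field names are illustrative.
type AuditEvent =
  | { type: 'SessionCreated'; orgId: string; sessionId: string; at: number }
  | { type: 'LivenessPassed'; orgId: string; sessionId: string; at: number; vendorScore: number }
  | { type: 'AiHintEmitted'; orgId: string; sessionId: string; at: number; redactedSummary: string };

// Exhaustive switch: adding a new event type without handling it becomes a compile error.
export function describe(e: AuditEvent): string {
  switch (e.type) {
    case 'SessionCreated': return `session ${e.sessionId} created`;
    case 'LivenessPassed': return `liveness passed (score ${e.vendorScore})`;
    case 'AiHintEmitted': return `AI hint: ${e.redactedSummary}`;
  }
}
```

Because every consumer switches on `type`, telemetry and audit queries stay in lockstep with the schema instead of drifting into ad-hoc log strings.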

Security posture

The Functions proxy applies content moderation, redacts PII, and enforces per-org quotas. Hosting ships CSP/Strict-Transport-Security, and Storage rules require signed uploads bound to session IDs.

  • CSP/SRI on Hosting

  • Firestore/Storage rules default deny

  • OpenAI proxied by Functions with rate limits + content filters

  • Redaction layer for exports/telemetry
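The redaction layer for exports and telemetry can be as small as a pattern pass before anything leaves the server. A sketch — the patterns and the `[REDACTED]` token are assumptions, not the production rule set:

```typescript
// Illustrative redaction pass; patterns are examples, not the full production set.
const PII_PATTERNS: RegExp[] = [
  /\b\d{3}-\d{2}-\d{4}\b/g,        // US SSN-shaped numbers
  /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g,  // email addresses
  /\b(?:\d[ -]?){13,16}\b/g,       // card-number-shaped digit runs
];

export function redactPii(text: string): string {
  // Apply each pattern in order; anything matched is replaced wholesale.
  return PII_PATTERNS.reduce((out, re) => out.replace(re, '[REDACTED]'), text);
}
```

Running every outbound string (logs, exports, AI prompts) through one choke point like this is what makes "no raw PII in logs" verifiable with contract tests.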

SignalStore state and streaming AI in Angular 20

Screening state with SignalStore

SignalStore gave us predictable, testable state and tiny components. We model session progress, streaming tokens, and file statuses with computed signals.

  • No global mutable soup—derive everything from a small core

  • Optimistic UI with server confirmations

  • Error taxonomy mapped to user actions

Code: store + streaming service

// screening.store.ts
import { computed } from '@angular/core';
import { signalStore, withState, withComputed, withMethods, patchState } from '@ngrx/signals';

interface UploadStatus { name: string; progress: number; done: boolean; error?: string }
interface ScreeningState {
  sessionId: string | null;
  step: 'start'|'liveness'|'id-upload'|'ai-review'|'complete';
  uploads: UploadStatus[];
  aiTokens: string[];
  errors: string[];
}

export const ScreeningStore = signalStore(
  { providedIn: 'root' },
  withState<ScreeningState>({ sessionId: null, step: 'start', uploads: [], aiTokens: [], errors: [] }),
  withComputed(({ aiTokens }) => ({
    // Derived transcript lives on the store instance; recomputes only when tokens change.
    transcript: computed(() => aiTokens().join('')),
  })),
  withMethods((store) => ({
    start(sessionId: string) { patchState(store, { sessionId, step: 'liveness' }); },
    addUpload(name: string) { patchState(store, { uploads: [...store.uploads(), { name, progress: 0, done: false }] }); },
    setProgress(name: string, progress: number) {
      patchState(store, { uploads: store.uploads().map(u => u.name === name ? { ...u, progress } : u) });
    },
    completeUpload(name: string) {
      patchState(store, { uploads: store.uploads().map(u => u.name === name ? { ...u, progress: 100, done: true } : u) });
    },
    appendToken(t: string) { patchState(store, { aiTokens: [...store.aiTokens(), t] }); },
    nextStep(step: ScreeningState['step']) { patchState(store, { step }); },
    error(msg: string) { patchState(store, { errors: [...store.errors(), msg] }); }
  }))
);
// ai-stream.service.ts
import { Injectable, inject } from '@angular/core';
import { ScreeningStore } from './screening.store';

@Injectable({ providedIn: 'root' })
export class AiStreamService {
  // Inject the store instance; the SignalStore class itself is only a DI token.
  private readonly store = inject(ScreeningStore);

  async stream(sessionId: string) {
    const res = await fetch(`/api/screeningStream?session=${encodeURIComponent(sessionId)}`, {
      method: 'GET', headers: { Accept: 'text/event-stream' }, credentials: 'include'
    });
    if (!res.ok || !res.body) throw new Error(`stream failed: ${res.status}`);
    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';
    while (true) {
      const { value, done } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value, { stream: true });
      // Network chunks can split an SSE line mid-JSON; keep the trailing partial line.
      const lines = buffer.split('\n');
      buffer = lines.pop() ?? '';
      for (const line of lines) {
        if (!line.startsWith('data:')) continue;
        const token = JSON.parse(line.slice(5)).token as string;
        this.store.appendToken(token);
      }
    }
  }
}

Code: Firebase Functions proxy to OpenAI (redacted)

// functions/src/screeningStream.ts
import { onRequest } from 'firebase-functions/v2/https';
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY! });

export const screeningStream = onRequest({ cors: true }, async (req, res) => {
  try {
    const { session } = req.query as { session: string };
    // Auth, org quota check, and PII guard elided for brevity
    res.setHeader('Content-Type', 'text/event-stream');
    res.setHeader('Cache-Control', 'no-cache');

    const stream = await openai.chat.completions.create({
      model: 'gpt-4o-mini', stream: true,
      messages: [
        { role: 'system', content: 'You assist with ID verification. Never output raw PII.' },
        { role: 'user', content: `Summarize screening ${session} with redacted hints.` }
      ]
    });

    for await (const part of stream) {
      const token = part.choices?.[0]?.delta?.content ?? '';
      if (token) res.write(`data:${JSON.stringify({ token })}\n\n`);
    }
    res.end();
  } catch (e) {
    // HttpsError is for callable functions; plain HTTP handlers report via status codes.
    res.status(500).end('stream-failed');
  }
});

PrimeNG UI for guided capture

<p-steps [model]="steps" [activeIndex]="activeIndex()" aria-label="Verification steps"></p-steps>
@if (store.step() === 'id-upload') {
  <section>
    <p-fileUpload mode="basic" name="idFront" accept="image/*" [customUpload]="true"
      (uploadHandler)="uploadFront($event)" chooseLabel="Upload ID front"></p-fileUpload>
  </section>
}

Security rules, CSP, and audit trails

Firestore/Storage rules

// firestore.rules
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    function isOrgMember(orgId) { return request.auth != null && request.auth.token.orgs[orgId] == true; }
    match /orgs/{orgId}/candidates/{candidateId}/{document=**} {
      allow read: if isOrgMember(orgId) && request.auth.token.role in ['reviewer','admin'];
      allow write: if isOrgMember(orgId) && request.auth.uid == candidateId; // candidate can write their session inputs
    }
    match /orgs/{orgId}/audit/{eventId} {
      allow read: if isOrgMember(orgId) && request.auth.token.role == 'admin';
      allow write: if false; // append-only: writes go through Functions via the Admin SDK
    }
  }
}
// storage.rules
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    match /orgs/{orgId}/sessions/{sessionId}/{fileName} {
      allow write: if request.auth != null && request.resource.size < 8 * 1024 * 1024 &&
        request.auth.token.orgs[orgId] == true && request.auth.token.sid == sessionId; // signed per-session
      allow read: if false; // reviewers read via Functions after scanning
    }
  }
}

CSP on Hosting

// firebase.json (headers excerpt)
{
  "hosting": {
    "headers": [
      {
        "source": "**",
        "headers": [
          { "key": "Content-Security-Policy", "value": "default-src 'self'; script-src 'self' 'sha256-...'; style-src 'self' 'unsafe-inline'; img-src 'self' blob: data:; connect-src 'self' https://identitytoolkit.googleapis.com https://firestore.googleapis.com https://firebasestorage.googleapis.com https://<region>-<project>.cloudfunctions.net; frame-ancestors 'none'; base-uri 'self'" },
          { "key": "Strict-Transport-Security", "value": "max-age=63072000; includeSubDomains; preload" }
        ]
      }
    ]
  }
}

Typed audit events

Audit rows include eventType, actorId, orgId, sessionId, redactedSummary, hash(parentState). That let us reconstruct timelines without storing raw biometric artifacts.

  • Every mutation emits an append-only event

  • Event schema versioned and validated in CI
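The hash(parentState) idea chains each audit row to its predecessor, so tampering anywhere breaks every later hash. A sketch under those assumptions — the row shape mirrors the fields listed above, and `hashEvent`/`appendEvent` are hypothetical helper names:

```typescript
import { createHash } from 'node:crypto';

// Illustrative audit row; fields mirror the prose above, helpers are hypothetical.
interface AuditRow {
  eventType: string;
  actorId: string;
  orgId: string;
  sessionId: string;
  redactedSummary: string;
  parentHash: string; // hash of the previous row, forming a tamper-evident chain
}

export function hashEvent(row: AuditRow): string {
  return createHash('sha256').update(JSON.stringify(row)).digest('hex');
}

export function appendEvent(prev: AuditRow | null, next: Omit<AuditRow, 'parentHash'>): AuditRow {
  // First event anchors the chain; later events point at their predecessor's hash.
  return { ...next, parentHash: prev ? hashEvent(prev) : 'genesis' };
}
```

Verifying a timeline then reduces to walking the chain and recomputing hashes — no raw biometric artifact ever needs to be stored or replayed.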

Recruiter dashboards that don’t jitter

Virtualized real-time lists

We used buffered snapshot updates and animation throttling to avoid jank when 100+ sessions flipped states simultaneously. The result: smooth 60fps transitions and an INP p75 under 120ms on reviewer machines.

  • Firestore snapshots with buffered updates

  • PrimeNG DataView + skeletons for perceived speed

Role-based views (RBAC/ABAC)

Admins saw escalation controls; reviewers saw redacted summaries; candidates saw only their session. Policies were enforced in UI, Functions, and Firestore rules.

  • Custom Claims for org/role

  • Resource-level policies for sensitive fields
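The claims shape from the rules (an `orgs` map plus a `role`) translates directly into an application-level check. A sketch — `canReadSession` and the field names are illustrative, not the production policy engine:

```typescript
// Mirrors the custom-claims shape used in the Firestore rules; helper is illustrative.
interface Claims {
  orgs: Record<string, boolean>;
  role: 'candidate' | 'reviewer' | 'admin';
}

export function canReadSession(claims: Claims, orgId: string, field: string): boolean {
  if (!claims.orgs[orgId]) return false;                            // must belong to the org
  if (claims.role === 'admin') return true;                          // admins see everything
  if (claims.role === 'reviewer') return field !== 'rawBiometrics';  // reviewers get the redacted view
  return false;                                                      // candidates read only via their own session path
}
```

Enforcing the same predicate in the UI, in Functions, and in the rules is what keeps the three layers from disagreeing during a security review.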

Measurable results: 12k+ interviews processed

What changed

The platform reliably handled bursts during hiring surges. Typed events cut MTTR because we could correlate failures by event type and org. Recruiters trusted the stream because it never stuttered, even under load.

  • 97% automated verification on first pass

  • Manual review time reduced 2.5×

  • 99.98% uptime across Firebase + Functions

  • No raw PII in logs; zero privacy incidents

  • Dashboard INP p75 ~120ms; median AI token latency ~65ms

When to hire an Angular developer for secure AI + biometrics

Bring in help if

As an Angular consultant, I come in to stabilize architecture, model state with Signals, lock down Firebase rules, and ship guarded AI via Functions. If you need to rescue a chaotic codebase first, see how we can stabilize your Angular codebase with gitPlumbers before layering in biometrics and AI.

  • You need real-time verification with strict privacy boundaries

  • Your current app jitters under streaming data

  • You lack typed events and audits to pass security reviews

  • You want Signals/SignalStore but can’t risk a rewrite


Key takeaways

  • Real-time biometric screening in Angular 20+ demands strict boundaries: no PII in logs, Functions-only AI calls, and Firestore rules that default to deny.
  • Signals + SignalStore kept UI resilient under streaming AI and liveness checks; typed event schemas made telemetry and audits trustworthy.
  • Firebase (Auth/Firestore/Storage/Functions) provided low-latency, real-time updates; OpenAI was wrapped behind a proxy with content filters and rate limits.
  • PrimeNG components accelerated delivery (steppers, file uploads, overlays) without sacrificing accessibility AA or Core Web Vitals.
  • Outcome: 12k+ interviews processed, 97% auto-verification, 99.98% uptime, and hours saved per role with measurable UX and security wins.

Implementation checklist

  • Segment PII early and keep it server-only; never log raw biometrics.
  • Use Firebase Custom Claims for org/role RBAC and ABAC on resources.
  • Stream AI via Cloud Functions; never call OpenAI directly from the browser.
  • Model state with SignalStore; emit typed events for every state transition.
  • Protect Storage with signed uploads and virus/liveness gates before Firestore writes.
  • Instrument Core Web Vitals and business KPIs; alert on anomalies, not noise.
  • Add kill switches for AI vendors; implement exponential backoff and retry.
  • Ship a redaction layer for telemetry and PDFs; verify with contract tests.
  • Harden Hosting with CSP/SRI and limited connect-src.
  • Prove value: time-to-verify, manual review rate, and false-positive rate on every release.
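The backoff-and-retry item on the checklist can be a small wrapper around any vendor call. A sketch with illustrative defaults — `withBackoff`, the attempt count, and the base delay are assumptions:

```typescript
// Retry with exponential backoff and jitter (sketch; defaults are illustrative).
export async function withBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts = 4,
  baseMs = 200,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Exponential delay with jitter: ~200ms, ~400ms, ~800ms, ...
      const delay = baseMs * 2 ** attempt * (0.5 + Math.random() / 2);
      await new Promise((r) => setTimeout(r, delay));
    }
  }
  throw lastErr;
}
```

A kill switch then just swaps `fn` for a function that rejects immediately, so the same wrapper drains traffic away from a failing vendor without code changes downstream.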

Questions we hear from teams

How much does it cost to hire an Angular developer for a secure AI + biometrics project?
Most teams start with a 1–2 week assessment and architecture sprint. Fixed‑scope proofs run 3–6 weeks. Full delivery varies by vendors and compliance. I offer flexible engagement models as a remote Angular consultant; we’ll size it after a quick discovery call.
How long does it take to add biometric verification to an Angular app?
A scoped MVP with liveness + ID upload, Functions proxy, and reviewer dashboard typically ships in 4–8 weeks. Add time for vendor onboarding, legal, and security reviews. We can parallelize with Nx to keep the Angular team moving while Functions and rules harden.
What does an Angular consultant do on a project like this?
I define boundaries (PII segregation, Functions proxy), model Signals/SignalStore state, wire Firebase Auth/Firestore/Storage/Rules, integrate AI streaming, add telemetry and audits, and coach your team through CI and UX polish. The goal is measurable outcomes without operational drama.
How do you protect PII and biometric data in Firebase and OpenAI?
PII is never logged. Biometrics live in Storage behind session‑scoped signed uploads. Functions perform vendor checks and store only redacted results. OpenAI calls are server‑only with content filters and quotas. Audit events are typed, hashed, and free of raw artifacts.
What’s a typical engagement timeline with AngularUX?
Discovery call within 48 hours. Assessment report in 5–7 days. Sprint plan and risk register the next week. Many teams see first value by week two, with measurable KPIs—time‑to‑verify, manual review rate, and Core Web Vitals—by week four.

Ready to level up your Angular experience?

Let AngularUX review your Signals roadmap, design system, or SSR deployment plan.

Hire Matthew – Remote Angular Expert (Available Now) See IntegrityLens – Live Angular + AI Verification
