
Data Virtualization for 50k+ Rows in Angular 20+: PrimeNG/Material Tables, Smooth Scroll, and Memory Budgets That Hold
The UX patterns and performance guardrails I use to keep enterprise tables fast, accessible, and beautiful when datasets explode.
Virtualization isn’t a trick; it’s the contract between UX polish and performance budgets.
If your dashboard jitters when a telecom feed bursts or an audit table breaches 50k rows, you’ve met the limits of naive *ngFor. I’ve been there—airport kiosks, employee tracking, and real-time ad analytics. The fix isn’t a bigger machine; it’s a deliberate UX+engineering contract: data virtualization plus design tokens that keep the DOM lean and readable.
This article shows how I ship smooth tables in Angular 20+ with PrimeNG and Angular Material, backed by Signals + SignalStore, and measured by hard budgets. We’ll keep accessibility, typography, and density aligned with the AngularUX palette—because polish and performance aren’t enemies.
When your table hits 50k rows, virtualization is your only friend
A real scene from production
At a leading telecom, our real-time analytics dashboard went from jittery (16–24 FPS) to smooth (60 FPS) by moving from infinite scroll to true virtualization. Same machine, same data, different rendering strategy. Similar gains showed up in an airport kiosk project—offline-tolerant lists with barcode scans no longer stuttered, even on locked-down hardware.
120k telemetry events/hour
Agents scrolling at 60 FPS
Memory capped under 350 MB
Why pagination was failing operators
Pagination looks safe but forces mental context switches. Operators chase rows across pages, filters reload, and DOM churn is still high under frequent updates. Virtualization preserves context: one scrollable surface with stable selection and minimal nodes.
High cognitive load
Lost context between pages
Wasted time on search + refine
Why data virtualization beats infinite scroll and dumb pagination
Performance math you can’t ignore
Rendering 50k rows naively means hundreds of thousands of DOM nodes (rows times cells); no framework wins that fight. Virtualization constrains nodes to what’s visible plus a buffer, keeping layout and paint predictable. Measured impact: telecom dashboard INP improved from 220 ms to 98 ms; memory dropped from ~1.2 GB to ~320 MB.
DOM nodes are expensive
Reflow/paint costs dominate
Heap growth kills INP
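The gap is easy to see on the back of an envelope. A minimal sketch, where the column count, viewport height, and buffer size are all assumed illustrative numbers, not measurements from the projects above:

```typescript
// Back-of-envelope DOM budget; every number here is an illustrative assumption.
const totalRows = 50_000;
const columns = 8;                                   // hypothetical column count
const naiveNodes = totalRows * columns;              // 400,000 cells if fully rendered

const viewportHeight = 600;                          // px, visible table area
const rowHeight = 40;                                // px, the density token
const bufferRows = 20;                               // roughly minBufferPx / rowHeight
const virtualRows = Math.ceil(viewportHeight / rowHeight) + bufferRows; // 35 rows
const virtualNodes = virtualRows * columns;          // 280 cells

console.log(`naive: ${naiveNodes} cells, virtualized: ${virtualNodes} cells`);
```

Three orders of magnitude fewer live nodes is why layout and paint stay flat no matter how large the buffer behind the table grows.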
Operator UX and accessibility
Virtualization lets us keep selection and focus stable while streaming updates. We also retain AA color contrast, readable type at dense configurations, and screen reader cues that don’t spam users.
Preserve selection & keyboard flow
Announce counts and updates
Sticky headers with correct roles
Angular 20+ data virtualization patterns with Signals + SignalStore
Virtualization is a collaboration between state (Signals), rendering (CDK/PrimeNG), and design tokens (typography/density/colors). Keep the contract tight and measured.
SignalStore: keep only what you need in view
I use SignalStore (NgRx Signals) to isolate the viewport slice and avoid recomputing the full table on every tick. A ring buffer caps growth under real-time feeds.
Rows array + filters + viewport
Derived visibleRows() signal
Ring buffer for streams
Store skeleton
import { computed } from '@angular/core';
import { signalStore, withState, withComputed, withMethods, patchState } from '@ngrx/signals';

export interface Row { id: string; ts: number; name: string; status: 'ok' | 'warn' | 'err'; }

interface TableState {
  rows: Row[];                              // raw buffer (bounded)
  filter: string;                           // quick client filter
  viewport: { start: number; end: number }; // indices
  rowHeight: number;                        // from density token
  maxRows: number;                          // ring buffer cap
}

export const TableStore = signalStore(
  { providedIn: 'root' },
  withState<TableState>({
    rows: [], filter: '', viewport: { start: 0, end: 100 }, rowHeight: 40, maxRows: 50000,
  }),
  withComputed(({ rows, filter }) => ({
    filtered: computed(() => {
      const f = filter().toLowerCase();
      return f ? rows().filter(r => r.name.toLowerCase().includes(f)) : rows();
    }),
  })),
  withComputed(({ filtered, viewport }) => ({
    visibleRows: computed(() => filtered().slice(viewport().start, viewport().end)),
  })),
  withMethods(store => ({
    updateViewport(start: number, end: number) {
      patchState(store, { viewport: { start, end } });
    },
    pushRows(incoming: Row[]) {
      const merged = store.rows().concat(incoming);
      const max = store.maxRows();
      // ring buffer: trim head when over cap
      patchState(store, { rows: merged.length > max ? merged.slice(merged.length - max) : merged });
    },
  })),
);

Angular Material + CDK Virtual Scroll
<cdk-virtual-scroll-viewport
  [itemSize]="store.rowHeight()"
  [minBufferPx]="800"
  [maxBufferPx]="1600"
  (scrolledIndexChange)="onScrolledIndex($event)"
  class="table-viewport">
  <table mat-table [dataSource]="store.visibleRows()" [trackBy]="trackById"
         class="mat-elevation-z2" role="grid"
         [attr.aria-rowcount]="store.filtered().length">
    <!-- Define columns -->
    <ng-container matColumnDef="name">
      <th mat-header-cell *matHeaderCellDef scope="col">Name</th>
      <td mat-cell *matCellDef="let r">{{ r.name }}</td>
    </ng-container>
    <tr mat-header-row *matHeaderRowDef="['name']; sticky: true"></tr>
    <tr mat-row *matRowDef="let row; columns: ['name']"></tr>
  </table>
</cdk-virtual-scroll-viewport>

trackById = (_: number, r: Row) => r.id;

Note: MatTable doesn’t natively virtualize, and you can’t stack *cdkVirtualFor and *matRowDef on the same row element. Instead, wire the viewport’s scrolledIndexChange to store.updateViewport so visibleRows() stays a small window, and keep the header sticky. Keep row templates tight: no expensive pipes per cell.
Lock itemSize to density token
Use cdkVirtualFor with trackBy
Minimize templates per row
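The scroll wiring itself is just index math. A minimal sketch of what a (scrolledIndexChange) handler would feed into store.updateViewport; the page size and the 10-row backward buffer are assumed tuning values, not numbers from the article:

```typescript
// Sketch: map the first visible index (from scrolledIndexChange) to a store window.
// `page` and the 10-row backward buffer are assumed tuning values.
function viewportWindow(
  firstVisible: number,
  page = 100,
  total = Number.MAX_SAFE_INTEGER,
): { start: number; end: number } {
  const start = Math.max(0, firstVisible - 10); // keep a few rows behind the fold
  const end = Math.min(total, start + page);
  return { start, end };
}
```

In the component, onScrolledIndex(i) would compute `const w = viewportWindow(i)` and call `store.updateViewport(w.start, w.end)`.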
PrimeNG p-table virtual scroll
<p-table
  [value]="store.filtered()"
  [virtualScroll]="true"
  [rows]="100"
  [scrollHeight]="'600px'"
  [virtualScrollItemSize]="store.rowHeight()"
  [lazy]="true"
  [totalRecords]="store.filtered().length"
  (onLazyLoad)="onLazyLoad($event)"
  [loading]="loading()"
  [rowTrackBy]="trackById"
  styleClass="ux-table">
  <ng-template pTemplate="header">
    <tr><th>Name</th></tr>
  </ng-template>
  <ng-template pTemplate="body" let-row>
    <tr><td>{{ row.name }}</td></tr>
  </ng-template>
</p-table>

onLazyLoad(evt: { first: number; rows: number }) {
  this.store.updateViewport(evt.first, evt.first + evt.rows);
}

PrimeNG’s virtual mode requests slices; we simply map those indices to the store’s filtered buffer.
Enable [virtualScroll] and [lazy]
Set [virtualScrollItemSize]
Return slices via onLazyLoad
Micro-optimizations that move the needle
- trackBy prevents full row re-renders.
- Don’t create new closures/objects in templates.
- Batch WebSocket events (e.g., 16 ms windows) before pushRows to avoid change storms.
trackBy always
Avoid unbounded async pipes
Coalesce stream updates
Memory discipline: pooling and cleanup
// Lightweight view model pool
interface RowVM { id: string; name: string; statusClass: string; }
const vmPool: RowVM[] = [];

function asVM(r: Row): RowVM {
  const vm = vmPool.pop() ?? { id: '', name: '', statusClass: '' };
  vm.id = r.id;
  vm.name = r.name;
  vm.statusClass = r.status === 'err' ? 't-danger' : r.status === 'warn' ? 't-warn' : 't-ok';
  return vm;
}

function releaseVM(vm: RowVM) { vmPool.push(vm); }

Pool when you map high-frequency updates to view models. Release when rows fall out of the viewport to keep GC predictable.
Pool view models when mapping
Prefer readonly primitives
Avoid detached orphan arrays
A11y that survives virtualization
<div class="sr-only" aria-live="polite" aria-atomic="true">
  Showing {{ store.visibleRows().length }} of {{ store.filtered().length }} rows
</div>

- Use role="grid" and scope="col" on headers.
- Keep focus in-row while scrolling; avoid re-creating focused nodes.
- Sticky headers must remain tabbable.
Announce total rows
Respect keyboard focus
Don’t spam live regions
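Not spamming the live region means coalescing announcements. A minimal framework-free sketch; `announce` is a stand-in for CDK LiveAnnouncer.announce or a manual write into the aria-live region above, and the 1-second interval is an assumed default:

```typescript
// Sketch: leading-edge throttle so screen readers hear at most one
// row-count update per interval. `announce` is a stand-in callback.
class ThrottledAnnouncer {
  private last = -Infinity;
  constructor(
    private announce: (msg: string) => void,
    private minIntervalMs = 1000,
    private now: () => number = () => Date.now(), // injectable clock for testing
  ) {}
  report(visible: number, total: number): void {
    const t = this.now();
    if (t - this.last < this.minIntervalMs) return; // drop bursty repeats
    this.last = t;
    this.announce(`Showing ${visible} of ${total} rows`);
  }
}
```

Streaming tables call report() on every batch; assistive tech only hears the throttled summaries.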
Typography, density, and the AngularUX palette
:root {
  /* AngularUX palette */
  --ux-primary-600: #2563eb;  /* Blue 600 */
  --ux-surface: #0b1020;      /* Dark surface */
  --ux-text-strong: #eef2ff;  /* Near-white */
  --ux-warn-600: #f59e0b;
  --ux-danger-600: #ef4444;
  /* Type scale */
  --ux-font-sm: 12.5px;
  --ux-font-md: 14px;
  --ux-font-lg: 16px;
  /* Density tokens map to virtualization itemSize */
  --ux-row-sm: 32px;
  --ux-row-md: 40px;
  --ux-row-lg: 48px;
}

.table-viewport { background: var(--ux-surface); color: var(--ux-text-strong); }
.table--dense { font-size: var(--ux-font-sm); }
.table--dense .row { height: var(--ux-row-sm); }
.table--comfortable { font-size: var(--ux-font-md); }
.table--comfortable .row { height: var(--ux-row-md); }

Match itemSize to the chosen density token to keep scroll math accurate. Validate AA contrast at each density with Pa11y/axe in CI.
Lock itemSize to density token
Maintain AA contrast at all densities
Scale hit targets for touch
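Keeping itemSize and the density token in sync is easiest when the component reads the token instead of duplicating the number. A minimal sketch; the token names follow the stylesheet above, and the browser call shown in the comment is the assumed usage:

```typescript
// Sketch: convert a density token value (as read via getComputedStyle)
// into the number the virtual scroller's itemSize input needs.
function tokenToPx(tokenValue: string): number {
  const px = parseFloat(tokenValue.trim());
  if (Number.isNaN(px)) throw new Error(`Not a px token: "${tokenValue}"`);
  return px;
}

// Browser usage (assumed):
// const rowHeight = tokenToPx(
//   getComputedStyle(document.documentElement).getPropertyValue('--ux-row-md'));
```

Failing loudly on a non-px value catches the case where a token is renamed in CSS but not in the component.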
Field notes: telecom analytics and airline kiosks
Telecom analytics dashboard
// Typed event schema + batched ingestion
import { Injectable, inject } from '@angular/core';
import { takeUntilDestroyed } from '@angular/core/rxjs-interop';
import { webSocket } from 'rxjs/webSocket';
import { bufferTime, filter } from 'rxjs';

type TelemetryEvt = { id: string; ts: number; name: string; status: 'ok' | 'warn' | 'err' };

@Injectable({ providedIn: 'root' })
export class TelemetryFeed {
  private readonly store = inject(TableStore);
  private readonly socket$ = webSocket<TelemetryEvt>('wss://telemetry.example/ws');

  constructor() {
    // Coalesce bursts into 16 ms batches before touching the store
    this.socket$
      .pipe(bufferTime(16), filter(batch => batch.length > 0), takeUntilDestroyed())
      .subscribe(batch => this.store.pushRows(batch));
  }
}

Results: 60 FPS sustained on commodity laptops, INP < 100 ms, heap stable near 320 MB while rendering 50k-window views. Charts (Highcharts) read from the same store; D3 tooltips updated without dropped frames.
Typed WebSockets
Batched updates (16 ms)
Ring buffer at 50k
Airport kiosk lists (offline-first)
When connectivity dropped, queues still scrolled smoothly using local buffers and virtualized lists. We simulated card readers and barcode scanners with Docker to stress-test input spikes. Virtualization kept memory predictable, and density tokens gave agents a compact-yet-readable view.
Docker hardware simulation
Peripheral APIs (scanner, printers)
Low-power devices
Design polish that doesn’t blow the budget: accessibility, typography, density, and color
Polish and performance aren’t in conflict when tokens and budgets guide the build. Storybook can snapshot dense/comfortable table states; Chromatic catches visual drift.
Readable at a glance
Use the AngularUX palette for status clarity: danger-600 for errors, warn-600 for warnings, primary-600 for actionable cells. Pair color with icons for color-blind support. Keep borders light; rely on spacing and type to reduce visual noise.
Hierarchy via weight/size—not borders
Status color + icon shape
Hover/selected contrast
Touch and keyboard parity
Even in dense mode, action cells hit 44px min for touch. Focus rings remain visible on dark surfaces. Headers keep a logical tab order without jumping when rows virtualize.
44px touch target min
Focus rings visible
Sticky header tab order
Measured rigor in CI
# lighthouse-ci.yaml
ci:
  collect:
    numberOfRuns: 2
  assert:
    assertions:
      categories:performance: [error, { minScore: 0.9 }]
      categories:accessibility: [error, { minScore: 0.95 }]
      max-memory-heap: [warn, { value: 380000000 }] # ~362 MB

We run this via Nx + GitHub Actions and push previews to Firebase. If memory or a11y dips, the PR fails.
Lighthouse CI budgets
axe/Pa11y checks
Firebase preview gating
When to Hire an Angular Developer for Legacy Rescue
If you’re evaluating whether to hire an Angular developer or contractor, I’m available for focused virtualization, upgrade, and rescue engagements.
Signals you need help now
If your team is staring at flame charts or fighting change detection storms, bring in an Angular consultant who has shipped real virtualized dashboards. I’ve rescued AngularJS→Angular migrations, refactored zone-heavy code to Signals, and stabilized codebases without feature freezes.
Scroll frame rate dipping below 24 FPS
Heap exceeds 500 MB on large tables
Zone-related change storms
MatTable + pipes choking under load
What the engagement looks like
We inventory data flows, set budgets, and build a vertical slice: SignalStore + virtual scroll + a11y + tokens. CI guardrails in Nx, Firebase previews, and canary deploys keep risk low. See how we “stabilize your Angular codebase” with gitPlumbers.
1-week assessment
2–4 week implementation
Zero-downtime rollout
Key takeaways and what to instrument next
- Use virtualization for 10k+ rows; tie itemSize to density tokens.
- Signals + SignalStore compute minimal slices; ring buffers cap memory.
- trackBy, pooled VMs, and batched events reduce churn.
- Validate AA contrast, keyboard flow, and live announcements.
- Instrument performance and a11y in CI; block regressions before merge.
If you need a senior Angular engineer to implement this in your app, let’s talk. I can review your tables, propose a measured plan, and ship without drama.
Key takeaways
- Virtualization beats pagination for operator speed and perceived performance; aim for 60 FPS and stable memory under 400 MB for 50k+ row tables.
- Use Signals + SignalStore to compute a minimal viewport slice; pair with CDK Virtual Scroll or PrimeNG’s virtual scroll.
- Lock row height (density token) to the virtual itemSize; align typography to keep measurement accurate and accessible.
- Use trackBy, object pooling, and ring buffers to avoid heap growth under real-time updates.
- Instrument scroll FPS, memory, and input latency with Angular DevTools, Lighthouse CI, and Firebase logs; ship budgets in CI.
Implementation checklist
- Decide table mode: paginated, infinite scroll, or virtualized (prefer virtualized for 10k+ rows).
- Set fixed row height and map to design tokens (density-sm/md/lg).
- Implement SignalStore: rows, filters, viewport window, derived visibleRows().
- Integrate CDK Virtual Scroll or PrimeNG virtual scroll with trackBy.
- Pool row view models; cap arrays with a ring buffer when streaming.
- Add a11y: aria-rowcount, keyboard focus guards, and live region updates.
- Add telemetry: FPS, memory, and INP; enforce budgets in CI with Lighthouse.
- Test on low-end devices and high-DPI monitors; verify contrast and legibility.
Questions we hear from teams
- How much does it cost to hire an Angular developer for virtualization work?
Most engagements run 2–6 weeks. A focused assessment plus vertical slice starts at two weeks; full-table refactors with CI guardrails typically run 4–6 weeks. Fixed-fee options are available after a brief code review.
- How long does an Angular 20+ virtualization implementation take?
- A minimal vertical slice (Signals + virtual scroll + a11y + tokens) is 1–2 weeks. Rolling the pattern across multiple tables with telemetry and CI budgets is 3–6 weeks, depending on complexity and upstream APIs.
- What does an Angular consultant do on a project like this?
- I map hot paths with Angular DevTools, set memory/FPS budgets, implement SignalStore, wire CDK/PrimeNG virtualization, enforce a11y and typography tokens, and add Nx + CI checks with Firebase previews. Then I train your team to own it.
- Can we keep our current design system and still virtualize?
- Yes. We align density and typography tokens to the virtual itemSize, adjust contrast to AA, and preserve your brand colors. Storybook/Chromatic snapshots stop drift without slowing delivery.
- Will virtualization help real-time charts too?
- Indirectly. The same SignalStore and batching patterns feed Highcharts/D3 without starving the main thread. You’ll see fewer dropped frames and more stable tooltips under load.