Home

This repository is a proof of concept playground for building a full-stack application with Angular and NestJS. It demonstrates setting up Angular SSR with a NestJS-powered backend, integrating a database, configuring a reverse proxy, and generating Swagger API documentation. The frontend is a fully installable PWA with a custom service worker that supports continuous builds during development. Perfect for exploring modern web development techniques and best practices.

Features

This is a personal feature playground. It is not intended for production use.

[Screenshot: NX graph]

How to Build and Run

This project uses bun for dependency management and development.

bun install
bun start

This installs dependencies and starts the development server. You can also start the backend and frontend separately:

# in one shell:
bun run dash-api
# and in another shell
bun run dash

Using npm Instead

If you encounter issues with bun or prefer npm, you can try the following:

rm -rf bun.lock
npm install --force
sed -i.bak -e 's/bun x/npx/g' -e 's/bun /npm /g' package.json

This removes the bun lockfile, installs dependencies with npm, and replaces bun commands in package.json with their npm equivalents.

Troubleshooting

  • Launching the VS Code Chrome debugger prints errors in the console: VS Code launches Chrome with its own default profile instead of your regular profile. That profile does not give the service worker normal access to the browser's Cache Storage, so our resources cannot be pre-cached. Open the URL in a regular browser profile to get full service worker functionality.

Backend for Frontend

Angular supports using an Express HTTP server for server-side rendering (SSR). This project extends that capability by integrating NestJS as a backend framework.

In server.ts, a NestExpressApplication is created:

const app = await NestFactory.create<NestExpressApplication>(ApiModule);

The Express instance is retrieved:

const server = app.getHttpAdapter().getInstance();

This instance is required by AngularNodeAppEngine for SSR:

const angularNodeAppEngine = new AngularNodeAppEngine();
server.use('*splat', (req, res, next) => {
  angularNodeAppEngine
    .handle(req, {
      server: 'express',
      request: req,
      response: res,
      cookies: req.headers.cookie,
    })
    .then((response) => {
      return response
        ? // If the Angular app returned a response, write it to the Express response
          writeResponseToNodeResponse(response, res)
        : // If not, this is not an Angular route, so continue to the next middleware
          next();
    })
    .catch(next);
});

Finally, the NestJS application is initialized:

await app.init();

You also need to expose the request handler so that the Angular app engine can work properly:

export const reqHandler = createNodeRequestHandler(server);

This produces an environment with an API, a database, and a frontend fully built and served by Angular SSR. When served through the production-ready Docker image, it also gets a nice Lighthouse score. [Screenshot: Lighthouse score when running in Docker]

Note

The Angular SSR process is used as Express middleware. This could potentially be moved into a NestMiddleware for further experimentation.

Database

Entities are auto-detected: with autoLoadEntities enabled in TypeOrmModule.forRoot, every entity registered through TypeOrmModule.forFeature is picked up automatically:

@Module({
  imports: [
    MyModule,
    TypeOrmModule.forRoot({
      type: 'sqlite',
      database: resolve(process.cwd(), 'db', 'home.db'),
      autoLoadEntities: true,
      synchronize: true,
      logging: true,
    }),
  ],
})
export class ApiModule {}

Then register the entities in your sub-modules:

@Module({
  imports: [TypeOrmModule.forFeature([MyEntity])],
  ...
})
export class MyModule {}
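
As a rough sketch of how such an entity and its repository might then be consumed (MyEntity and MyService are placeholder names, not this repo's actual code):

import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { Column, Entity, PrimaryGeneratedColumn, Repository } from 'typeorm';

// Hypothetical entity; autoLoadEntities picks it up once it is registered
// through TypeOrmModule.forFeature in its module
@Entity()
export class MyEntity {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  name: string;
}

// Hypothetical service injecting the repository registered above
@Injectable()
export class MyService {
  constructor(@InjectRepository(MyEntity) private readonly repo: Repository<MyEntity>) {}

  findAll(): Promise<MyEntity[]> {
    return this.repo.find();
  }
}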

Service Worker

This app is a PWA, requiring a web manifest and a service worker script registered at startup.
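
Registration itself is just the standard browser API. A minimal sketch, assuming the worker script is served as /service-worker.js (the actual file name in this repo may differ):

if ('serviceWorker' in navigator) {
  // Register the custom Workbox service worker once the app has started
  navigator.serviceWorker
    .register('/service-worker.js')
    .catch((err) => console.error('Service worker registration failed', err));
}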

One of the things a service worker does is preload and cache static resources like JavaScript, CSS, and index.html on the client. Angular's built-in ngsw generates a generic service worker at build time, but for more control this project uses Workbox, which is not yet fully compatible with Angular's build process.

Challenges with Angular Builds

  • Production Builds (nx build): Static files are available in the dist folder, making it straightforward to generate a pre-cache list.
  • Development Builds (nx serve): Files are built and served in memory, so the dist folder is unavailable.

Solution

This project uses two custom plugins:

  1. Custom esbuild plugin: Runs during nx serve to hook into esbuild's onEnd event. It generates a partial pre-cache list but cannot include CSS files.
  2. Custom webpack plugin: Runs during nx build to generate a complete pre-cache list from files written to disk.

The esbuild plugin is configured in project.json:

  "targets": {
    "build": {
      "executor": "@nx/angular:application",
      "options": {
        "plugins": ["apps/dash/builders/custom-esbuild.ts"],

For production builds, the pre-cache is overwritten using:

nx build
bun x webpack --config ./apps/dash/builders/webpack.config.js

This ensures an active service worker during both development and production, enabling testing of service worker-specific code without requiring production builds.
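
The webpack side is not shown above, but a plugin that builds the list from what is actually on disk could look roughly like this (a sketch under assumed paths, not the repo's actual builders):

import type { Compiler } from 'webpack';
import { readdirSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';

// Hypothetical plugin: after the bundle has been written to disk, walk the
// output folder and write the complete pre-cache list for the service worker
export class PrecacheManifestPlugin {
  apply(compiler: Compiler) {
    compiler.hooks.afterEmit.tap('PrecacheManifestPlugin', () => {
      const outDir = compiler.options.output.path ?? 'dist';
      const files = (readdirSync(outDir, { recursive: true }) as string[]).filter((f) =>
        /\.(js|css|html|webmanifest)$/.test(f),
      );
      writeFileSync(join(outDir, 'precache-manifest.json'), JSON.stringify(files, null, 2));
    });
  }
}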

Utilities

This repository includes reusable utilities. Feel free to use anything you find helpful.

Browser API Helpers

rxjs Utilities

  • Cache operator: Caches observable results for reuse across subscribers
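
A minimal sketch of such an operator, built on rxjs shareReplay (the version in this repo may add more, such as cache invalidation):

import { MonoTypeOperatorFunction, shareReplay } from 'rxjs';

// Replay the latest emission to every subscriber and expire it after ttlMs
export function cache<T>(ttlMs = 60_000): MonoTypeOperatorFunction<T> {
  return shareReplay<T>({ bufferSize: 1, windowTime: ttlMs, refCount: false });
}

// Usage: this.http.get<Config>('/api/config').pipe(cache(30_000));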

Other

Dashboard System

This app features a dashboard of mini-applications (widgets), each lazily loaded. While individual routes can use Angular's loadChildren, displaying multiple widgets in a single view (dashboard) requires additional logic.

The dashboard view uses a widget-loader to load widgets dynamically. Widgets are only loaded when instructed, minimizing client-side resource usage.

The widget service references widget routes to determine available widgets and their loading mechanisms. The dashboard view fetches a configuration from the backend (an array of widget names) and creates one widget-loader per widget. This approach also supports fullscreen widget routes.

This system allows for multiple dashboard configurations tailored to different needs.
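
The core idea behind the loader, as a sketch (component names, registry and import paths here are hypothetical, not the repo's actual implementation):

import { Component, Input, Type, ViewContainerRef, inject } from '@angular/core';

// Hypothetical registry mapping widget names to lazily imported components
const widgetRegistry: Record<string, () => Promise<Type<unknown>>> = {
  weather: () => import('./widgets/weather.component').then((m) => m.WeatherComponent),
};

@Component({ selector: 'app-widget-loader', standalone: true, template: '' })
export class WidgetLoaderComponent {
  private viewContainer = inject(ViewContainerRef);

  // The dashboard creates one loader per widget name in its configuration
  @Input() set widget(name: string) {
    void this.load(name);
  }

  private async load(name: string) {
    const component = await widgetRegistry[name]?.();
    this.viewContainer.clear();
    if (component) this.viewContainer.createComponent(component);
  }
}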

Widgets

Integrations

Third-party integrations use a reverse proxy configured in the SSR Express server:

import { createProxyMiddleware } from 'http-proxy-middleware';
import { proxyRoutes } from './proxy.routes';

Object.entries(proxyRoutes).forEach(([path, config]) => {
  server.use(path, createProxyMiddleware(config));
});
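
proxy.routes maps local paths to http-proxy-middleware options, roughly like this (the actual paths and targets in this repo differ):

import type { Options } from 'http-proxy-middleware';

// Hypothetical route table: each key becomes an Express mount path
export const proxyRoutes: Record<string, Options> = {
  '/api/weather': {
    target: 'https://api.met.no', // example target, not necessarily the one used here
    changeOrigin: true,
    pathRewrite: { '^/api/weather': '/weatherapi' },
  },
};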

Two widgets demonstrate integrations with third-party APIs:

Canvas and WebGPU Experiments

Two widgets demonstrate canvas and WebGPU effects:

These are simple experiments to explore some new (for me) technologies and techniques.

Transcription Service

A transcription widget uses AI to convert audio to text. To set it up:

winget install --id Python.Python.3.11
python -m pip install --upgrade pip
pip install faster-whisper
python -c "from faster_whisper import WhisperModel; WhisperModel('NbAiLab/nb-whisper-small', device='cpu', compute_type='int8')"

This installs Python and the faster-whisper package, and downloads the Norwegian Whisper model. After setup, bun start enables audio transcription (currently Norwegian only).

The transcription process involves:

  1. A Python script for transcription
  2. A backend controller to handle file uploads (see the sketch after this list)
  3. A widget for audio input - via microphone (requires permission) or file upload
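
A sketch of what the upload controller can look like (route, field name and script path are assumptions, not this repo's actual code):

import { Controller, Post, UploadedFile, UseInterceptors } from '@nestjs/common';
import { FileInterceptor } from '@nestjs/platform-express';
import { spawn } from 'node:child_process';
import { writeFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// Hypothetical controller: persists the uploaded audio and hands it to the Python script
@Controller('transcribe')
export class TranscribeController {
  @Post()
  @UseInterceptors(FileInterceptor('audio'))
  transcribe(@UploadedFile() file: Express.Multer.File): Promise<{ text: string }> {
    // Write the in-memory upload to disk so the script can read it
    const tmpFile = join(tmpdir(), `${Date.now()}-${file.originalname}`);
    writeFileSync(tmpFile, file.buffer);

    return new Promise((resolve, reject) => {
      const py = spawn('python', ['tools/transcribe.py', tmpFile]);
      let output = '';
      py.stdout.on('data', (chunk) => (output += chunk));
      py.on('error', reject);
      py.on('close', (code) =>
        code === 0 ? resolve({ text: output.trim() }) : reject(new Error(`Transcription failed (${code})`)),
      );
    });
  }
}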

WebLLM

The WebLLM chat experiment loads and runs a language model entirely in the browser, using WebGPU for performance. Small models can run cheaply on low-performance graphics hardware, but this experiment uses a medium-sized model that performs best on high-performance GPUs such as NVIDIA or AMD cards, and can feel a bit sluggish on low-end GPUs like integrated Intel graphics.
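
The essentials of running a model with the @mlc-ai/web-llm package look roughly like this (a sketch; the model id and prompt are examples, not necessarily what this widget uses):

import { CreateMLCEngine } from '@mlc-ai/web-llm';

// Download and compile the model in the browser; WebGPU does the heavy lifting
const engine = await CreateMLCEngine('Llama-3.1-8B-Instruct-q4f16_1-MLC', {
  initProgressCallback: (report) => console.log(report.text),
});

// OpenAI-style chat completion, executed entirely client-side
const reply = await engine.chat.completions.create({
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(reply.choices[0].message.content);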

Many systems have more than one GPU: one integrated on the motherboard and a second, more powerful GPU mounted as an expansion card. The integrated GPU will usually struggle with the model selected here, so for best performance open your OS graphics settings, select advanced settings, choose your browser, and enable the "High-performance" graphics processor for that program.
