
hamburg-gtfs-rt-server

Deprecated. The HVV HAFAS API has been shut off.

If you happen to port the code to Geofox's getVehicleMap API, or know of anyone else who has done it, please let me know!


Poll the HVV HAFAS endpoint to provide a GTFS Realtime (GTFS-RT) feed for Hamburg.

Prosperity/Apache license · support me via GitHub Sponsors · chat with me on Twitter

This project uses hafas-client & hafas-gtfs-rt-feed to fetch live data about all vehicles within a bounding box covering Hamburg and its surroundings, and to build a live GTFS Realtime (GTFS-RT) feed from it.

Installing & running

Note: hafas-gtfs-rt-feed, the library used by this project for building the GTFS-RT feed, has more extensive docs. For brevity and to avoid duplication (with e.g. berlin-gtfs-rt-server), the following instructions just cover the basics.

Prerequisites

hamburg-gtfs-rt-server needs access to a Redis server; you can configure a custom host/port by setting the REDIS_URL environment variable.

It also needs access to a PostgreSQL 12+ server; pass custom PG* environment variables if you run PostgreSQL in an unusual configuration.

It also needs access to a NATS Streaming server (just follow its setup guide); set the NATS_STREAMING_URL environment variable if you run it in an unusual configuration.
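
As an example (all hosts, ports & names below are placeholders, adapt them to your setup), the connection details for all three services can be provided via environment variables before running any of the commands below:

export REDIS_URL='redis://my-redis-host:6379'
export PGHOST=my-postgres-host PGPORT=5432 PGUSER=postgres PGDATABASE=postgres
export NATS_STREAMING_URL='nats://my-nats-host:4222'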

The start.sh script requires at least Bash 5.0 (it relies on features introduced in 5.0); macOS currently bundles Bash 3.2, so use brew install bash to install an up-to-date version.
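
To quickly check which Bash your PATH resolves to:

bash --version | head -n 1
# on macOS, install a recent Bash via Homebrew:
brew install bash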

git clone https://github.com/derhuerst/hamburg-gtfs-rt-server.git
cd hamburg-gtfs-rt-server
npm install

Building the matching index

npm run build

The build script will download the latest HVV GTFS Static data and import it into PostgreSQL. Then, it will add additional lookup tables to match realtime data with GTFS Static data. psql will need to have access to your database.
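
For example (connection details below are placeholders), psql and the import can be pointed at your database via the standard PG* environment variables:

export PGHOST=localhost PGPORT=5432 PGUSER=postgres PGPASSWORD=password
npm run build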

Running

Specify the bounding box to be observed as JSON:

export BBOX='{"north": 53.6744, "west": 9.7559, "south": 53.3660, "east": 10.2909}'

hamburg-gtfs-rt-server uses hafas-gtfs-rt-feed underneath, which is split into three parts: polling the HAFAS endpoint (monitor-hafas CLI), matching realtime data (match-with-gtfs CLI), and serving a GTFS-RT feed (serve-as-gtfs-rt CLI). You can run all three at once using the start.sh wrapper script:

./start.sh

In production, run all three using a tool like systemd, forever, or Kubernetes that restarts them when they crash.
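
As a minimal sketch (not an official deployment recipe for this repo), forever can keep the wrapper script running and restart it when it exits:

# forever defaults to running scripts with node, so point it at bash explicitly
forever start -c bash start.sh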

via Docker

A Docker image is available as derhuerst/hamburg-gtfs-rt-server.

Note: The Docker image does not contain Redis, PostgreSQL & NATS. You need to configure access to them using the environment variables documented above (e.g. NATS_STREAMING_URL).

export BBOX='{"north": 53.6744, "west": 9.7559, "south": 53.3660, "east": 10.2909}'
# build the matching index
docker run -e BBOX -i -t --rm derhuerst/hamburg-gtfs-rt-server ./build.sh
# run
docker run -e BBOX -i -t --rm derhuerst/hamburg-gtfs-rt-server
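
If Redis, PostgreSQL & NATS Streaming run outside of Docker, pass their connection details into the container as well; all hostnames & credentials below are placeholders:

docker run -e BBOX \
  -e REDIS_URL='redis://my-redis-host:6379' \
  -e NATS_STREAMING_URL='nats://my-nats-host:4222' \
  -e PGHOST=my-postgres-host -e PGUSER=postgres -e PGPASSWORD=password \
  -i -t --rm derhuerst/hamburg-gtfs-rt-server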

via docker-compose

The example docker-compose.yml starts up a complete set of containers (hamburg-gtfs-rt-server, Redis, PostGIS/PostgreSQL, NATS Streaming) to generate a GTFS-RT feed for hvv.

Be sure to set POSTGRES_PASSWORD, either via a .env file or an environment variable.
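
A minimal .env file (placed next to docker-compose.yml; the value is a placeholder) would look like this:

POSTGRES_PASSWORD=mySecretPassword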

The environment may be started via

$ POSTGRES_PASSWORD=mySecretPassword docker-compose up -d

After starting, the GTFS-RT feed should be available via http://localhost:3000/.
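
To quickly check that the feed responds (it is a binary protocol buffer, so save it to a file):

curl -s -o gtfs-rt-feed.pbf 'http://localhost:3000/'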

Note: Currently, build.sh hard-codes the URL of the GTFS feed used for matching HAFAS realtime data, because Hamburg's transparency portal doesn't provide a permanent URL for the latest dataset yet. You should check if a newer version is available and update the URL accordingly.

inspecting the feed

Check out hafas-gtfs-rt-feed's inspecting the feed section.

metrics

Check out hafas-gtfs-rt-feed's metrics section.

License

This project is dual-licensed: my contributions are licensed under the Prosperity Public License, while contributions of other people are licensed under Apache 2.0.

This license allows you to use and share this software for noncommercial purposes for free and to try this software for commercial purposes for thirty days.

Personal use for research, experiment, and testing for the benefit of public knowledge, personal study, private entertainment, hobby projects, amateur pursuits, or religious observance, without any anticipated commercial application, doesn’t count as use for a commercial purpose.

Buy a commercial license or read more about why I sell private licenses for my projects.

About

Expose Hamburg transit data as a GTFS-RT feed.

https://v0.hamburg-gtfs-rt.transport.rest
