Frontend node service for data collected from palantiri
Installing this code base requires npm, which ships with Node.js. Install Node.js locally before continuing with the installation.
$ git clone https://github.com/anidata/ht-archive.git
$ cd ht-archive
$ npm install
This guide will walk you through the process of setting up a local PostgreSQL server and importing the database backups so that you can use the ht-archive
web application.
If you would rather not use Docker, scroll down for instructions that use a locally installed PostgreSQL server.
Suggested Method: Docker

- If you haven't already installed Docker, follow the directions for your OS on the Docker website: https://www.docker.com/products/overview. Installing Docker via MacPorts or Homebrew is not suggested.
- Run the PostgreSQL server with the following command, after replacing /SOME_FOLDER with an easily accessible folder on your computer:
$ sudo docker run -d -e POSTGRES_PASSWORD=1234 -e POSTGRES_USER=dbadmin -e POSTGRES_DB=sandbox \
-v /SOME_FOLDER:/var/lib/postgresql/data -p 5432:5432 --name postgres postgres
- Download the PostgreSQL backup from one of the following places:
- Extract the SQL file from the downloaded crawler_er.tar.gz, using an archive tool or the following command:
$ tar xzf /THE_FOLDER_WITH_DOWNLOADED_FILE/crawler_er.tar.gz
- Copy the crawler.sql file that was extracted from crawler_er.tar.gz to /SOME_FOLDER:
$ cp /THE_FOLDER_WITH_DOWNLOADED_FILE/crawler.sql /SOME_FOLDER
- Load the SQL into PostgreSQL:
$ sudo docker exec postgres psql --username dbadmin -c "CREATE DATABASE crawler" sandbox
$ sudo docker exec postgres psql --username dbadmin -f /var/lib/postgresql/data/crawler.sql crawler
- Start the web application.
a. Start the application with:
$ sudo docker run -d -p 8080:8080 --link postgres:postgres --name ht-archive bmenn/ht-archive --db crawler --usr dbadmin --pwd 1234 --host postgres
b. If you have made changes and want to see the updated application, run:
$ node app.js --db crawler --usr dbadmin --pwd 1234 --host REPLACE_ME_WITH_DOCKER_OR_POSTGRES_IP
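The --db, --usr, --pwd, and --host flags above configure the app's PostgreSQL connection. As a rough sketch of what that wiring might look like (the flagsToConfig helper and field names are hypothetical, not the app's actual code; the flag names and values are taken from the commands above):

```javascript
// Hypothetical helper (not the app's actual code): map "--flag value" pairs
// from the command line into a node-postgres-style connection config.
function flagsToConfig(argv) {
  const flags = {};
  for (let i = 0; i < argv.length; i += 2) {
    flags[argv[i].replace(/^--/, '')] = argv[i + 1];
  }
  return {
    database: flags.db,   // --db crawler
    user: flags.usr,      // --usr dbadmin
    password: flags.pwd,  // --pwd 1234
    host: flags.host,     // --host postgres (the linked container's hostname)
    port: 5432,           // the port published by the docker run command above
  };
}

// The flags from the docker command above:
const config = flagsToConfig(['--db', 'crawler', '--usr', 'dbadmin',
                              '--pwd', '1234', '--host', 'postgres']);
console.log(config.database, config.host); // crawler postgres
```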
Follow these steps if you already have a PostgreSQL server running locally and would rather not use Docker.
- Download the PostgreSQL backup from one of the following places:
- Extract the SQL file from the downloaded crawler_er.tar.gz, using an archive tool or the following command:
$ tar xzf /the_folder_with_backup/crawler_er.tar.gz
- Set up your local database.
Log into your PostgreSQL server as the postgres superuser and create a new superuser named dbadmin with login permissions:
postgres=# CREATE ROLE dbadmin WITH SUPERUSER LOGIN PASSWORD '1234';
- Load the SQL into PostgreSQL.
Create a database called crawler, exit psql, and run the following command. Be patient: the query can take 15 minutes or so to run.
$ psql --host=localhost --dbname=crawler --username=dbadmin -f <path/to/.sql/file>
- Once the data has been loaded into PostgreSQL, start the web application and navigate to localhost:8080 in your browser:
$ node app.js --db crawler --usr dbadmin --pwd 1234 --host localhost
In general: $ node app.js --db <database> --usr <postgres-user> --host <hostname> # the app will launch on port 8080