mlaugharn / search_engine


HW: Search Engine

In this assignment you will create a highly scalable web search engine.

Due Date: Sunday, 9 May

Learning Objectives:

  1. Learn to work with a moderately large software project
  2. Learn to parallelize data analysis work off the database
  3. Learn to work with WARC files and the multi-petabyte common crawl dataset
  4. Increase familiarity with indexes and rollup tables for speeding up queries

Task 0: project setup

  1. Fork this github repo, and clone your fork onto the lambda server

  2. Ensure that you'll have enough free disk space by:

    1. bringing down any running docker containers
    2. running the command
      $ docker system prune
      

Task 1: getting the system running

In this first task, you will bring up all the docker containers and verify that everything works.

There are three docker-compose files in this repo:

  1. docker-compose.yml defines the database and pg_bouncer services
  2. docker-compose.override.yml defines the development flask web app
  3. docker-compose.prod.yml defines the production flask web app served by nginx

Your tasks are to:

  1. Modify the docker-compose.override.yml file so that the port exposed by the flask service is different.

  2. Run the script scripts/create_passwords.sh to generate a new production password for the database.

  3. Build and bring up the docker containers.

  4. Enable ssh port forwarding so that your local computer can connect to the running flask app.

  5. Use firefox on your local computer to connect to the running flask webpage. If you've done the previous steps correctly, all the buttons on the webpage should work without giving you any error messages, but there won't be any data displayed when you search.

  6. Run the script

    $ sh scripts/check_web_endpoints.sh
    

    to perform automated checks that the system is running correctly. All tests should report [pass].
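For step 1 above, the port change in docker-compose.override.yml might look like the following sketch. The service name `flask` and the container-internal port `5000` are assumptions; check the actual file in your fork.

```yaml
# docker-compose.override.yml (sketch -- service name and internal port are assumptions)
services:
  flask:
    ports:
      # host_port:container_port -- pick an unused host port so you don't
      # collide with other students on the shared lambda server
      - "9123:5000"
```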

Task 2: loading data

There are two services for loading data:

  1. downloader_warc loads an entire WARC file into the database; typically, this will be about 100,000 urls from many different hosts.
  2. downloader_host searches all WARC entries in either the common crawl or the internet archive that match a particular pattern, and adds all of them to the database.

Task 2a

We'll start with the downloader_warc service. There are two important files in this service:

  1. services/downloader_warc/downloader_warc.py contains the python code that actually does the insertion
  2. downloader_warc.sh is a bash script that starts up a new docker container connected to the database, then runs the downloader_warc.py file inside that container

Next follow these steps:

  1. Visit https://commoncrawl.org/the-data/get-started/
  2. Find the url of a WARC file. On the common crawl website, the paths to WARC files are referenced from the Amazon S3 bucket. In order to get a valid HTTP url, you'll need to prepend https://commoncrawl.s3.amazonaws.com/ to the front of the path.
  3. Then, run the command
    $ ./downloader_warc.sh $URL
    
    where $URL is the url to your selected WARC file.
  4. Run the command
    $ docker ps
    
    to verify that the docker container is running.
  5. Repeat these steps to download at least 5 different WARC files, each from different years. Each of these downloads will spawn its own docker container and can happen in parallel.
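The URL construction in step 2 above can be sketched in Python. The WARC path below is a placeholder for illustration, not a real file:

```python
# Common Crawl's index files list WARC paths relative to the S3 bucket,
# so we prepend the bucket's HTTPS endpoint to get a downloadable URL.
PREFIX = "https://commoncrawl.s3.amazonaws.com/"

def warc_url(s3_path: str) -> str:
    """Convert an S3-relative WARC path into a valid HTTP URL."""
    return PREFIX + s3_path.lstrip("/")

# Placeholder path for illustration only:
path = "crawl-data/CC-MAIN-2021-17/segments/.../warc/CC-MAIN-....warc.gz"
print(warc_url(path))
```

The resulting URL is what you pass as `$URL` to `./downloader_warc.sh`.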

You can verify that your system is working with the following checks. (Note that they are listed in order of how soon you will start seeing results.)

  1. Run docker logs on your downloader_warc containers.
  2. Run the query
    SELECT count(*) FROM metahtml;
    
    in psql.
  3. Visit your webpage in firefox and verify that search terms are now getting returned.

Task 2b

The downloader_warc service above downloads many urls quickly, but they are mostly low-quality urls. For example, most URLs do not include the date they were published, and so their contents will not be reflected in the ngrams graph. In this task, you will implement and run the downloader_host service for downloading high quality urls.

  1. The file services/downloader_host/downloader_host.py has 3 FIXME statements. You will have to complete the code in these statements to make the python script correctly insert WARC records into the database.

    HINT: The code will require that you use functions from the cdx_toolkit library; consult its documentation. You can also reference the downloader_warc service for hints, since it accomplishes a similar task.

  2. Run the query

    SELECT * FROM metahtml_test_summary_host;
    

    to display all of the hosts for which the metahtml library has test cases proving it can extract publication dates. Note that the query above lists the hosts in key syntax form, and you'll have to convert each host into standard form.

  3. Select 5 hostnames from the list above, then run the command

    $ ./downloader_host.sh "$HOST"
    

    to insert the urls from these 5 hostnames.
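The key ("SURT") form used by the query above reverses a host's labels and appends a closing parenthesis, so `com,latimes)` corresponds to `latimes.com`. The conversion can be sketched with a small helper (the function name is mine, not part of the repo):

```python
def key_to_host(key: str) -> str:
    """Convert a host in key (SURT) syntax, e.g. 'com,latimes)',
    into standard form, e.g. 'latimes.com'."""
    labels = key.rstrip(")").split(",")
    return ".".join(reversed(labels))

print(key_to_host("com,latimes)"))        # latimes.com
print(key_to_host("org,wikipedia,en)"))   # en.wikipedia.org
```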

Task 3: speeding up the webpage

Since everyone seems pretty overworked right now, I've done this step for you.

There are two steps:

  1. create indexes for the fast text search
  2. create materialized views for the count(*) queries

Submission

  1. Edit this README file with the results of the following queries in psql. The results of these queries will be used to determine if you've completed the previous steps correctly.

    1. This query shows the total number of webpages loaded:

      select count(*) from metahtml;
      
       novichenko=# select count(*) from metahtml;
         count
       ---------
        1824986
       (1 row)
      
    2. This query shows the number of webpages loaded / hour:

      select * from metahtml_rollup_insert order by insert_hour desc limit 100;
      
novichenko=#  select * from metahtml_rollup_insert order by insert_hour desc limit 100;
 hll_count |  url   | hostpathquery | hostpath | host  |      insert_hour
-----------+--------+---------------+----------+-------+------------------------
         4 |  10791 |         10854 |     7115 |     4 | 2021-05-10 05:00:00+00
         4 |  26556 |         26875 |    17480 |     4 | 2021-05-10 04:00:00+00
         4 |  26186 |         26985 |    17433 |     4 | 2021-05-10 03:00:00+00
         4 |  27543 |         26669 |    18488 |     4 | 2021-05-10 02:00:00+00
         4 |  26806 |         27346 |    18842 |     4 | 2021-05-10 01:00:00+00
         4 |  27764 |         27901 |    19494 |     4 | 2021-05-10 00:00:00+00
         4 |  27776 |         27387 |    18558 |     4 | 2021-05-09 23:00:00+00
         5 |  27648 |         26833 |    18299 |     5 | 2021-05-09 22:00:00+00
         4 |  23882 |         24291 |    14843 |     4 | 2021-05-09 21:00:00+00
         3 |  19985 |         19552 |    11873 |     3 | 2021-05-09 20:00:00+00
         3 |  20448 |         20588 |    12363 |     3 | 2021-05-09 19:00:00+00
         3 |  21086 |         20356 |    12315 |     3 | 2021-05-09 18:00:00+00
         3 |  21130 |         20802 |    12435 |     3 | 2021-05-09 17:00:00+00
         3 |  19729 |         20474 |    11584 |     3 | 2021-05-09 16:00:00+00
         3 |  19831 |         19341 |    10481 |     3 | 2021-05-09 15:00:00+00
         3 |  20411 |         21306 |    12184 |     3 | 2021-05-09 14:00:00+00
         3 |  20604 |         20379 |    11575 |     3 | 2021-05-09 13:00:00+00
         3 |  19737 |         19073 |    11419 |     3 | 2021-05-09 12:00:00+00
         3 |  20071 |         20522 |    12368 |     3 | 2021-05-09 11:00:00+00
         3 |  19631 |         20061 |    11624 |     3 | 2021-05-09 10:00:00+00
         3 |  19647 |         18457 |     9861 |     3 | 2021-05-09 09:00:00+00
         3 |  19789 |         18663 |    10697 |     3 | 2021-05-09 08:00:00+00
         3 |  19974 |         19037 |    11020 |     3 | 2021-05-09 07:00:00+00
         3 |  19529 |         18795 |    11079 |     3 | 2021-05-09 06:00:00+00
         3 |  19304 |         19786 |    12025 |     3 | 2021-05-09 05:00:00+00
         3 |  20661 |         19753 |    11800 |     3 | 2021-05-09 04:00:00+00
         3 |  20167 |         20067 |    12660 |     3 | 2021-05-09 03:00:00+00
         3 |  20385 |         19951 |    12410 |     3 | 2021-05-09 02:00:00+00
         3 |  20520 |         19087 |    12406 |     3 | 2021-05-09 01:00:00+00
         3 |  19289 |         19785 |    11904 |     3 | 2021-05-09 00:00:00+00
         3 |  19319 |         19454 |    11174 |     3 | 2021-05-08 23:00:00+00
         3 |  19565 |         19087 |    11579 |     3 | 2021-05-08 22:00:00+00
         3 |  20646 |         19760 |    11725 |     3 | 2021-05-08 21:00:00+00
         3 |  18318 |         17710 |    11569 |     3 | 2021-05-08 20:00:00+00
         3 |  19603 |         18046 |    11242 |     3 | 2021-05-08 19:00:00+00
         3 |  19432 |         19299 |    11395 |     3 | 2021-05-08 18:00:00+00
         3 |  18973 |         19327 |    12619 |     3 | 2021-05-08 17:00:00+00
         3 |  20338 |         19792 |    12126 |     3 | 2021-05-08 16:00:00+00
         3 |  19711 |         20122 |    12438 |     3 | 2021-05-08 15:00:00+00
         3 |  17127 |         17812 |    10116 |     3 | 2021-05-08 14:00:00+00
         3 |  18904 |         19335 |    10791 |     3 | 2021-05-08 13:00:00+00
         3 |  20528 |         19903 |    10973 |     3 | 2021-05-08 12:00:00+00
         3 |  19083 |         19783 |    11206 |     3 | 2021-05-08 11:00:00+00
         3 |  19071 |         20245 |    11718 |     3 | 2021-05-08 10:00:00+00
         3 |  19632 |         19780 |    11501 |     3 | 2021-05-08 09:00:00+00
         3 |  18916 |         19496 |    12111 |     3 | 2021-05-08 08:00:00+00
         3 |  19880 |         19886 |    12474 |     3 | 2021-05-08 07:00:00+00
         3 |  21146 |         19879 |    12099 |     3 | 2021-05-08 06:00:00+00
         3 |  20919 |         21232 |    13216 |     3 | 2021-05-08 05:00:00+00
         3 |  20604 |         19777 |    11528 |     3 | 2021-05-08 04:00:00+00
         3 |  19655 |         19521 |    10990 |     3 | 2021-05-08 03:00:00+00
         3 |  19366 |         19237 |    10543 |     3 | 2021-05-08 02:00:00+00
         3 |  19596 |         19981 |    11424 |     3 | 2021-05-08 01:00:00+00
         3 |  18995 |         19560 |    12377 |     3 | 2021-05-08 00:00:00+00
         4 |  20196 |         21404 |    12903 |     4 | 2021-05-07 23:00:00+00
         4 |  21794 |         20828 |    12668 |     4 | 2021-05-07 22:00:00+00
         4 |  23130 |         23096 |    14609 |     4 | 2021-05-07 21:00:00+00
         4 |  23528 |         23601 |    15073 |     4 | 2021-05-07 20:00:00+00
         4 |  22790 |         22445 |    14156 |     4 | 2021-05-07 19:00:00+00
         4 |  22403 |         21585 |    13387 |     4 | 2021-05-07 18:00:00+00
         4 |  23512 |         22767 |    14251 |     4 | 2021-05-07 17:00:00+00
         4 |  21938 |         20857 |    13858 |     4 | 2021-05-07 16:00:00+00
         4 |  24284 |         22252 |    14139 |     5 | 2021-05-07 15:00:00+00
         4 |  22941 |         21981 |    13984 |     5 | 2021-05-07 14:00:00+00
         4 |  24425 |         22053 |    14672 |     5 | 2021-05-07 13:00:00+00
         4 |  23535 |         22116 |    13419 |     4 | 2021-05-07 12:00:00+00
         4 |  22657 |         21068 |    13513 |     4 | 2021-05-07 11:00:00+00
         4 |  24730 |         22932 |    14821 |     4 | 2021-05-07 10:00:00+00
         4 |  24798 |         23840 |    15512 |     4 | 2021-05-07 09:00:00+00
         4 |  23567 |         23421 |    15513 |     4 | 2021-05-07 08:00:00+00
         4 |   2152 |          2077 |     1243 |     4 | 2021-05-07 07:00:00+00
         2 |  28515 |         28821 |    28222 | 26138 | 2021-05-05 07:00:00+00
         4 | 144890 |        142697 |   123561 | 59911 | 2021-05-05 06:00:00+00
         4 |  86405 |         87289 |    82957 | 47381 | 2021-05-05 05:00:00+00
(74 rows)
    3. This query shows the hostnames that you have downloaded the most webpages from:

      select * from metahtml_rollup_host order by hostpath desc limit 100;
      
novichenko=# select * from metahtml_rollup_host2 order by hostpath desc limit 100;
  url   | hostpathquery | hostpath |            host
--------+---------------+----------+-----------------------------
 354970 |        346724 |   343913 | com,latimes)
 241658 |        218458 |   208440 | com,theatlantic)
  42181 |         43515 |    40858 | com,sfchronicle)
  10643 |          3982 |     3720 | us,nautil)
    177 |           176 |      172 | com,theacornonline)
    110 |           110 |      110 | com,pandora)
    124 |           124 |      107 | com,wiley,onlinelibrary)
    106 |           106 |      106 | org,apache,mail-archives)
     89 |            89 |       89 | org,wikipedia,en)
     99 |            99 |       86 | com,freecode)
     86 |            86 |       86 | com,beau-coup)
     88 |            88 |       86 | com,twopeasinabucket)
     83 |            83 |       83 | com,accuweather)
     79 |            79 |       79 | com,grabcad)
     78 |            78 |       78 | fm,last)
     77 |            77 |       77 | com,scribdassets,imgv2-3)
     76 |            76 |       76 | com,sandiegouniontribune)
     75 |            75 |       75 | com,thestreet)
     75 |            75 |       75 | com,tripadvisor,no)
     72 |            72 |       72 | com,bigstockphoto)
     72 |            72 |       72 | com,timesargus)
     71 |            71 |       71 | id,co,tripadvisor)
     70 |            70 |       70 | com,appbrain)
     67 |            67 |       67 | mx,com,tripadvisor)
     67 |            67 |       67 | com,agoda)
     68 |            68 |       67 | com,nymag)
     67 |            67 |       67 | com,pbase)
     66 |            66 |       66 | com,popsugar)
     65 |            65 |       65 | com,upi)
     65 |            65 |       65 | nl,tripadvisor)
     65 |            65 |       65 | com,rutlandherald)
     65 |            65 |       65 | ru,tripadvisor)
     64 |            64 |       64 | eg,com,tripadvisor)
     64 |            64 |       64 | tw,com,tripadvisor)
     70 |            70 |       64 | com,cafemom)
     63 |            63 |       63 | com,tripadvisor,pl)
     63 |            63 |       63 | dk,tripadvisor)
     72 |            72 |       63 | com,dpreview)
     62 |            62 |       62 | com,tmz)
     96 |            96 |       62 | com,bloomberg)
     62 |            62 |       62 | com,gamefaqs)
     62 |            62 |       62 | com,theguardian)
     62 |            62 |       62 | br,com,tripadvisor)
     61 |            61 |       61 | ar,com,tripadvisor)
     64 |            64 |       61 | com,iheart)
     61 |            61 |       61 | com,directorio-foros)
     61 |            61 |       61 | com,scribdassets,imgv2-2)
     61 |            61 |       61 | com,scribdassets,imgv2-4)
     61 |            61 |       61 | com,engadget)
     61 |            61 |       61 | com,tripadvisor,th)
     60 |            60 |       60 | net,sourceforge)
     60 |            60 |       60 | com,grouprecipes)
     64 |            64 |       59 | com,nameberry)
     59 |            59 |       59 | org,debian,lists)
     59 |            59 |       59 | org,worldcat)
     58 |            58 |       58 | com,tripadvisor)
     62 |            62 |       58 | com,snagajob)
     58 |            58 |       58 | com,6pm)
     58 |            58 |       58 | com,foursquare)
     58 |            58 |       58 | org,metoperashop)
     57 |            57 |       57 | org,netbsd,mail-index)
     58 |            58 |       57 | gov,clinicaltrials)
     57 |            57 |       57 | com,gizmodo)
     57 |            57 |       57 | com,newgrounds)
     55 |            55 |       55 | com,tylerpaper)
     55 |            55 |       55 | com,threadless)
     55 |            55 |       55 | fr,tripadvisor)
     55 |            55 |       55 | com,cricketarchive)
     55 |            55 |       55 | gov,loc,chroniclingamerica)
     55 |            55 |       55 | com,scribdassets,imgv2-1)
     56 |            56 |       55 | com,economist)
     55 |            55 |       55 | com,oracle,docs)
     54 |            54 |       54 | org,w3,lists)
     54 |            54 |       54 | com,epicsports,football)
     54 |            54 |       54 | com,causes)
     54 |            54 |       54 | org,wikipedia,es)
     54 |            54 |       54 | com,foodily)
     96 |            96 |       54 | com,nytimes)
     54 |            54 |       54 | com,genealogy)
 347783 |        355916 |       54 | com,ycombinator,news)
     53 |            53 |       53 | net,worldcosplay)
     53 |            53 |       53 | com,partsgeek)
     55 |            55 |       53 | com,aikenstandard)
     53 |            53 |       53 | com,weather)
     53 |            53 |       53 | tr,com,tripadvisor)
     53 |            53 |       53 | gov,delaware,courts)
     52 |            52 |       52 | com,moultrienews)
     52 |            52 |       52 | ve,com,tripadvisor)
     52 |            52 |       52 | com,ebaumsworld)
     52 |            52 |       52 | de,tripadvisor)
     52 |            52 |       52 | kr,co,tripadvisor)
     52 |            52 |       52 | net,themeforest)
     56 |            56 |       52 | com,edmunds)
     52 |            52 |       52 | org,wikipedia,ru)
     54 |            54 |       51 | com,jpcycles)
     68 |            68 |       51 | org,summitpost)
     51 |            51 |       51 | cl,tripadvisor)
     56 |            56 |       51 | com,dafont)
     51 |            51 |       51 | net,blogmarks)
     51 |            51 |       51 | org,votesmart)
(100 rows)
  2. Take a screenshot of an interesting search result. Add the screenshot to your git repo, and modify the <img> tag below to point to the screenshot.

I was just looking for 'python' and I found this...

  3. Commit and push your changes to github.

  4. Submit the link to your github repo in sakai.
