- The data was analysed in a Jupyter notebook, data-analysis.ipynb.
- The data was ingested into a SQLite database by parsing the XML data and inserting the records with Python. The code for this lives in ingest.py.
The script can ingest data from any XML file with a similar format into a SQLite database using the following command-line arguments:
python3 ingest.py <XML-File-Location> <DatabaseFile-Name>
If no arguments are provided, the defaults are used. Currently, the posts data has been ingested into the bio-info.db file.
- The API was made in Flask.
i. Enter the following commands once in the terminal to create a virtual environment and install the dependencies:
pipenv shell
pipenv install -r requirements.txt
ii. Enter the following command to run the server:
env FLASK_ENV=development FLASK_APP=app.py flask run
iii. The API documentation can be viewed here
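A Flask endpoint serving the ingested data might look like the sketch below. The route path, the posts table, and its columns are assumptions for illustration; the real routes are listed in the API documentation.

```python
import sqlite3

from flask import Flask, jsonify

app = Flask(__name__)
DB_PATH = "bio-info.db"  # the database produced by ingest.py


@app.route("/posts/<int:post_id>")
def get_post(post_id):
    # Open a connection per request; adequate for a small development API.
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row  # lets us convert rows to dicts
    row = conn.execute(
        "SELECT * FROM posts WHERE id = ?", (post_id,)
    ).fetchone()
    conn.close()
    if row is None:
        return jsonify(error="not found"), 404
    return jsonify(dict(row))
```

With the server running, GET /posts/1 would return the matching record as JSON, or a 404 error if no such post exists.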
- The frontend was made in React.
i. Enter the following commands to move to the frontend directory and install the dependencies:
cd frontend
npm install
ii. To run the frontend in the browser, enter the following command:
npm start