How to handle a file with hundreds of MB of JSON data (1,000,000 records).
Possible options:
- To stream the JSON using Node Streams (Discarded: It's complex and there are easier ways to do it)
- To use a database:
  - Since the data were in JSON format, it seemed natural to import them into a NoSQL database like MongoDB
  - Where to host the data?
    - Using the MongoDB Stitch service (serverless functions) (Discarded: too limited, incomplete MongoDB API)
    - Using a full-fledged MongoDB database in the cloud
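The chosen approach was a one-off import into MongoDB rather than streaming. Below is a minimal sketch of such an import script, assuming the records live in a single JSON array in `books.json` and using the official `mongodb` Node driver; the database name, collection name and `MONGODB_URI` environment variable are illustrative placeholders, not taken from the project.

```js
// import-books.js: one-off bulk import of books.json into MongoDB.
// Assumptions: books.json holds one JSON array of records, the machine
// has enough memory to parse it once (streaming was ruled out above),
// and MONGODB_URI points at the target cluster. All names here are
// illustrative, not the project's actual ones.
const fs = require('fs');
const { MongoClient } = require('mongodb');

async function main() {
  const records = JSON.parse(fs.readFileSync('books.json', 'utf8'));

  const client = new MongoClient(process.env.MONGODB_URI);
  await client.connect();
  const books = client.db('booklist').collection('books');

  // Insert in batches so each insertMany call stays reasonably small.
  const BATCH = 1000;
  for (let i = 0; i < records.length; i += BATCH) {
    await books.insertMany(records.slice(i, i + BATCH), { ordered: false });
  }

  await client.close();
}

main().catch(console.error);
```

The same load can also be done without a script, using MongoDB's `mongoimport` tool with its `--jsonArray` flag.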
Features:
- Backend
  - Next.js framework (with server-side rendering capabilities)
  - REST API with one endpoint: https://book-list.now.sh/api/books
  - Query string parameters can be passed to the /books endpoint to sort, filter and paginate the records (see the API route sketch after this list)
  - Isomorphic JavaScript
- Frontend
  - ReactJS
  - Lazy-loaded images (only images in the visible portion of the page are loaded)
  - Material UI
  - Responsive design
  - Use of the Repository pattern
  - Proof of concept of "infinite scroll" (see the sketch after this list)
- Database
  - MongoDB hosted in the cloud
- Testing
  - End-to-end (e2e) tests
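As a hedged illustration of the backend feature above, the /api/books endpoint could take the shape of a Next.js API route like the sketch below. The query parameter names (`sortBy`, `order`, `page`, `limit`, `author`) and the database and collection names are assumptions for the example, not the project's actual API.

```js
// pages/api/books.js: illustrative shape of the endpoint; parameter
// and collection names are assumptions, not the project's actual ones.
import { MongoClient } from 'mongodb';

let client; // reused across requests

async function getCollection() {
  if (!client) {
    client = new MongoClient(process.env.MONGODB_URI);
    await client.connect();
  }
  return client.db('booklist').collection('books');
}

export default async function handler(req, res) {
  const {
    sortBy = 'title', // field to sort on
    order = 'asc',    // 'asc' or 'desc'
    page = '1',       // 1-based page number
    limit = '20',     // records per page
    author,           // optional filter
  } = req.query;

  const filter = author ? { author } : {};
  const sort = { [sortBy]: order === 'desc' ? -1 : 1 };
  const skip = (Number(page) - 1) * Number(limit);

  const books = await getCollection();
  const results = await books
    .find(filter)
    .sort(sort)
    .skip(skip)
    .limit(Number(limit))
    .toArray();

  res.status(200).json(results);
}
```

Sorting and filtering at this scale only works if the fields involved are indexed, as noted later in this document.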
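The infinite-scroll proof of concept mentioned above can be sketched with React and the browser's IntersectionObserver API. The component, the response shape (`_id`, `title`) and the query parameters are assumptions for illustration, not the project's actual code.

```jsx
// InfiniteBookList.jsx: proof-of-concept sketch, not the project's
// actual component; field and parameter names are assumptions.
import React, { useEffect, useRef, useState } from 'react';

export default function InfiniteBookList() {
  const [books, setBooks] = useState([]);
  const [page, setPage] = useState(1);
  const sentinel = useRef(null);

  // Fetch the next page of records whenever `page` changes.
  useEffect(() => {
    fetch(`/api/books?page=${page}&limit=20`)
      .then((res) => res.json())
      .then((next) => setBooks((prev) => [...prev, ...next]));
  }, [page]);

  // Advance the page when the sentinel at the bottom of the list
  // scrolls into view; the same observer idea drives lazy images.
  useEffect(() => {
    const observer = new IntersectionObserver(([entry]) => {
      if (entry.isIntersecting) setPage((p) => p + 1);
    });
    if (sentinel.current) observer.observe(sentinel.current);
    return () => observer.disconnect();
  }, []);

  return (
    <ul>
      {books.map((book) => (
        <li key={book._id}>{book.title}</li>
      ))}
      <li ref={sentinel} />
    </ul>
  );
}
```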
- Download or clone the source code
- Change to the local source code directory
- Install the dependencies from the command line:
  `npm install`
- Start the development server:
  `npm run dev`
- Open http://localhost:3000 in the browser to load the application
- Run the tests:
  `npm run test`
- Due to time constraints, only e2e tests have been implemented, because integration tests have priority for apps based on RESTful APIs
- For the same reason, not all cases have been tested, but the remaining ones are similar to those already covered
- Adding indexes to the database for the fields used in sorting and filtering is essential. Otherwise, the database returns an error caused by exceeding the maximum memory limit needed to process the 1,000,000 records (see the sketch below)
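As a sketch of that step, the indexes could be created once with the Node driver as below; the field names `title` and `author` are placeholders for whichever fields the API actually sorts and filters on. Without an index, MongoDB has to sort the full result set in memory and aborts once the sort exceeds its memory limit.

```js
// create-indexes.js: one-off script; field names are illustrative.
const { MongoClient } = require('mongodb');

async function main() {
  const client = new MongoClient(process.env.MONGODB_URI);
  await client.connect();
  const books = client.db('booklist').collection('books');

  // One index per field used for sorting or filtering, so queries
  // never fall back to an in-memory sort over 1,000,000 records.
  await books.createIndex({ title: 1 });
  await books.createIndex({ author: 1 });

  await client.close();
}

main().catch(console.error);
```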