YENCHICHLEE / backend-social-network


Backend Social Network

This is the backend application of a social network project used in my YouTube videos.

https://www.youtube.com/playlist?list=PLab_if3UBk9-TuyqwMy3JvNHeCh7Ll9Wz

Chapter 1

In the first video, I've created the Maven project with IntelliJ IDEA and described the folder structure and the pom.xml of the empty Maven project.

Then, I've converted the empty project into a Spring Boot project by adding the Spring Boot dependency as the parent project. This way, my project inherits all the architecture and configuration of a Spring Boot project. After that, I've also added the Spring Boot Web starter dependency to indicate that this project will use the web layer.
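The pom.xml changes described above might look like the following fragment (the version number is illustrative, not necessarily the one used in the videos):

```xml
<!-- Inherit the Spring Boot architecture and configuration -->
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.7.0</version>
</parent>

<dependencies>
    <!-- Pulls in the web layer: Spring MVC and an embedded Tomcat -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
</dependencies>
```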

Finally, I've created the main method, ran the project and saw that the Tomcat web server was ready to accept requests.

Chapter 2

In the second video of this playlist, I've created the packages for a 3-tier architecture: a presentation layer where the controllers are located, a logic layer where the services are located, and a data layer where the data structure of the database is located.

I've also created the controllers to accept the HTTP requests from the frontend application, and created the request mappings to map each URL to a method. And I've created the services where all the business logic will be placed, but left them empty as I'm missing the database configuration to fetch the data, so I will complete the services in another video.

I've also injected the services into the controllers using the dependency injection of Spring.

I've created a DTO (Data Transfer Object) package containing the objects that will be sent to and from the frontend. This avoids sending the objects that represent the database structure, hiding the database schema from the Internet and exposing only what I want. I will need to map the data objects to the DTO objects, but we will handle this later with some useful libraries.
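The idea can be sketched in plain Java (the class and field names here are illustrative, not the project's actual classes): the entity mirrors the database row, while the DTO exposes only the public fields.

```java
public class DtoExample {
    // Entity: mirrors the database structure, including sensitive fields.
    record UserEntity(long id, String name, String passwordHash) {}

    // DTO: only the fields the frontend is allowed to see.
    record UserDto(long id, String name) {}

    // Hand-written mapping; later this is delegated to a mapping library.
    static UserDto toDto(UserEntity entity) {
        return new UserDto(entity.id(), entity.name());
    }

    public static void main(String[] args) {
        UserEntity entity = new UserEntity(1L, "alice", "$2a$10$hash");
        UserDto dto = toDto(entity);
        System.out.println(dto); // the password hash never leaves the backend
    }
}
```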

Chapter 3

In this third video, I've added authentication using JWT. The authentication is divided into three parts: the HTTP filter, the provider and the entry point. The HTTP filter intercepts the HTTP requests to read the credentials from the sign-in endpoint, or the Bearer token from the rest of the endpoints. The provider searches for the user information given the credentials or token from the previous step. And the entry point returns a custom error when an authentication problem occurs.

There are two ways to authenticate: with the credentials (login and password) or with the Bearer token. The credentials are only sent to the signIn or signUp endpoints, which return the user information together with a newly created Bearer token. For the rest of the requests, the previously obtained token is sent in the Authorization header to authenticate the user.

The advantage of the JWT is that it is stateless. The token itself contains the information about the user and the validity of the token. I only need the user to be stored in the database, the rest of the information comes inside the token.
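The stateless idea can be shown with plain Java: a JWT is three Base64URL-encoded parts (header.payload.signature), and the claims travel inside the payload itself. The token below is hand-built for illustration; a real token is issued by the backend and its signature is verified with the server's secret key.

```java
import java.util.Base64;

public class JwtExample {
    // Encode a JSON fragment the way JWT parts are encoded.
    static String encode(String json) {
        return Base64.getUrlEncoder().withoutPadding()
                     .encodeToString(json.getBytes());
    }

    // Decode the middle (payload) part of a header.payload.signature token.
    static String payloadOf(String token) {
        String payload = token.split("\\.")[1];
        return new String(Base64.getUrlDecoder().decode(payload));
    }

    public static void main(String[] args) {
        String token = encode("{\"alg\":\"HS256\"}") + "."
                     + encode("{\"sub\":\"alice\",\"exp\":1700000000}") + "."
                     + "fake-signature";
        // The user and the expiry come from the token, not from a session store.
        System.out.println(payloadOf(token));
    }
}
```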

Chapter 4

In this fourth video, I've configured the database connection with JPA. I've set the initialization-mode property to always, to rebuild the database every time the application starts. This may be a problem unless you write your SQL scripts taking into account that the file may have already been executed.

I've created the Java entities to be mapped against the database. I've mapped each column and created the one-to-many and many-to-one relationships between the tables. There was also a table connected with itself in a many-to-many relationship; many-to-many relationships require an intermediary table to make the connection.

And finally, I've created the Spring Data JPA repositories to read the data from the database. I've created those repositories with methods that derive their query just from the method name, and I've also created other methods with custom queries.
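A repository combining both styles might look like the sketch below (the Post entity and its fields are illustrative, and the Spring Data JPA imports must be on the classpath):

```java
import java.time.LocalDateTime;
import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;

public interface PostRepository extends JpaRepository<Post, Long> {

    // Derived query: Spring builds "... WHERE p.user.id = ?1" from the name alone.
    List<Post> findByUserId(Long userId);

    // Custom JPQL query for cases the naming convention cannot express cleanly.
    @Query("SELECT p FROM Post p WHERE p.creationDate > :since ORDER BY p.creationDate DESC")
    List<Post> findRecentPosts(@Param("since") LocalDateTime since);
}
```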

Chapter 5

In this fifth video I've added two useful libraries: Lombok and MapStruct.

Lombok is used to generate getters, setters, constructors, builders and more from a few annotations at the class level. This lets me write POJOs more quickly, specifying only the fields and the annotations; the rest is generated automatically by Lombok.

MapStruct is used to map two objects field by field. I use MapStruct to map the entity objects to the DTO objects: as I don't want entity objects to be returned by the controllers, for privacy reasons, I use DTOs instead. With some annotations, MapStruct maps the incoming object to the outgoing object field by field.
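Put together, the two libraries reduce a POJO and its mapper to a few annotations, as in this sketch (class and field names are illustrative, and both libraries must be on the classpath for the annotation processing to run):

```java
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.mapstruct.Mapper;

@Data                // Lombok: getters, setters, equals, hashCode, toString
@Builder             // Lombok: a fluent builder
@NoArgsConstructor
@AllArgsConstructor
public class UserDto {
    private Long id;
    private String name;
}

@Mapper(componentModel = "spring")   // MapStruct: implementation generated at compile time
interface UserMapper {
    // Fields with matching names are mapped automatically.
    UserDto toDto(User user);
}
```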

Chapter 6

In the sixth video, I introduce Liquibase, a database schema manager. Liquibase allows you to perform modifications on your database (schema or data) in a controlled manner. You save the modifications in changesets that are run with the Liquibase Maven plugin. The changesets are immutable and should come with a rollback command to allow Liquibase to perform a clean rollback if needed.

The changesets are separated by file, usually named with the application version, the author and a sequence number. Those changesets must not be modified: Liquibase will throw an error and be blocked if a changeset that has already run against the database has changed its content.

Liquibase also has the rollback command to return to a previous state of the database. The rollback command only works if each changeset has an associated rollback query. When running the rollback command, you must specify how many changesets you want to revert.
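An XML changeset with its rollback block might look like this (the table, column, author and schema version are assumptions for illustration):

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.9.xsd">

    <!-- Once this changeset has run, its content must never change. -->
    <changeSet id="1" author="sergio">
        <addColumn tableName="users">
            <column name="avatar_url" type="varchar(255)"/>
        </addColumn>
        <!-- Lets the rollback command undo this changeset cleanly. -->
        <rollback>
            <dropColumn tableName="users" columnName="avatar_url"/>
        </rollback>
    </changeSet>
</databaseChangeLog>
```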

Chapter 7

In this seventh video, I show you aspect-oriented programming with the RestException handler. This way, all the controllers will have their exceptions intercepted without any additional code.

Aspect-oriented programming surrounds a part of the application. This way, when an exception leaves the controller, the aspect will catch it.

What I intend to do with the RestException handler is to catch the functional exceptions and return a JSON document with some information about the error.
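In Spring, this kind of handler is typically a @RestControllerAdvice class, sketched below (NotFoundException is a hypothetical functional exception of the project; Spring Web must be on the classpath):

```java
import java.util.Map;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

// Intercepts exceptions thrown by any controller, with no extra code in them.
@RestControllerAdvice
public class RestExceptionHandler {

    @ExceptionHandler(NotFoundException.class)
    public ResponseEntity<Map<String, String>> handleNotFound(NotFoundException e) {
        // Return a small JSON document instead of a stack trace.
        return ResponseEntity.status(HttpStatus.NOT_FOUND)
                .body(Map.of("error", "NOT_FOUND", "message", e.getMessage()));
    }
}
```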

Chapter 8

In the eighth video, I explain how to properly upload an image to Spring Boot using the MultipartFile object. This object helps me save the file on the machine and obtain some information about the file (such as the size and the name).

Nevertheless, storing the image on the local machine is not the best idea. I explain why you should use a CDN (Content Delivery Network) or an NFS (Network File System) to share the image across multiple servers all around the world. This way, downloading the image will be faster than serving it from the backend.

Along the way, I show the usage of the @Value annotation to inject configuration values into variables at runtime.
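Combined, the two ideas might look like the sketch below (the property name, the service class and the method are assumptions for illustration; Spring Web must be on the classpath):

```java
import java.io.IOException;
import java.nio.file.Path;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import org.springframework.web.multipart.MultipartFile;

@Service
public class ImageService {

    // Injected at runtime from application.properties (hypothetical key).
    @Value("${images.upload-dir}")
    private String uploadDir;

    public String store(MultipartFile file) throws IOException {
        // MultipartFile also exposes getSize(), getContentType(), etc.
        Path target = Path.of(uploadDir, file.getOriginalFilename());
        file.transferTo(target);
        return target.toString();
    }
}
```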

Chapter 9.1

Having the main parts of the application developed, I now introduce how to write the tests. This video covers not only the unit tests, but also the integration tests.

I start by adding the required dependencies: JUnit 5, Mockito and Spring Test. As the topic is too large for a single video, I've split it into two videos: this one to configure the needed dependencies and write the unit tests and the tests against the services; and the second one to write the tests against the controllers and the repositories.

For the unit tests, I show how to use assertions, and reflection when I need to inject a field inside a bean. For the service tests, I show how Mockito injects mocks and spies into the service under test.
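A Mockito-based service test typically follows the shape below (the service, repository and entity names are illustrative, referring to the project's own classes; JUnit 5 and Mockito must be on the classpath):

```java
import java.util.List;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

@ExtendWith(MockitoExtension.class)
class PostServiceTest {

    @Mock
    private PostRepository postRepository;   // mocked dependency

    @InjectMocks
    private PostService postService;         // class under test, mocks injected

    @Test
    void shouldReturnPostsOfUser() {
        when(postRepository.findByUserId(1L)).thenReturn(List.of(new Post(1L, "hello")));

        List<Post> posts = postService.getPostsOfUser(1L);

        assertEquals(1, posts.size());
        verify(postRepository).findByUserId(1L);
    }
}
```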

Chapter 9.2

Let's now take the controllers and repositories into account. Those tests are more complicated because I will need the Spring context.

For the controllers, I will load the Spring context to have the endpoints available. Then, I use MockMvc to perform the HTTP requests and the assertions on the response. As with Mockito, I can inject mocks and beans, but I have to use different annotations.

For the database, I've chosen the H2 in-memory database. This way, I don't need a real database running when executing the tests: the H2 database is created just for the tests and destroyed at the end. Here, I will also need the Spring context to handle the repositories. And I must tell Liquibase to work with H2 to inject data into the database; this way, the tests are more realistic, with some existing data.
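A test configuration wiring H2 and Liquibase together might look like this fragment (the property values and changelog path are assumptions, placed in src/test/resources/application.properties):

```properties
# In-memory H2 database, created for the tests and dropped afterwards
spring.datasource.url=jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1
spring.datasource.driver-class-name=org.h2.Driver

# Liquibase seeds the H2 database so the tests run against existing data
spring.liquibase.change-log=classpath:db/changelog/db.changelog-master.xml
```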

Chapter 9.3

In this chapter, I've introduced Podam to create objects with random values. This is very useful for testing purposes, as the fields are set to random values like those that would be accepted in production.

Nevertheless, care must be taken when creating the factory to avoid infinite loops (creating a child in the parent, which creates again the parent, and the children again, and so on).

I can also create custom factories for specific fields if I want concrete values.

Chapter 9.4

Another chapter for the tests. This time for the code coverage with JaCoCo, covering the unit tests and the integration tests separately.

First, I need to separate the unit tests and the integration tests in my Maven workflow. For that, I've added the Surefire and Failsafe plugins to run them separately. I must ensure the integration tests are named with the '*IT.java' suffix so they can be easily identified. I need these plugins because they offer many more configuration options than Maven's default test execution (which will be useful to hook in JaCoCo).

With the tests run separately, I can configure the JaCoCo plugin. The JaCoCo plugin adds an argument to the Surefire and Failsafe plugins before they run the tests: a Java agent that inspects the test execution to create the coverage report. At the end of the test phases, JaCoCo runs another step to generate the reports for the unit and integration tests separately.
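The plugin wiring might look like the fragment below (the plugin version is illustrative): Failsafe picks up the *IT.java classes, and JaCoCo attaches its agent to both test phases before generating the two reports.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals><goal>integration-test</goal><goal>verify</goal></goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.8</version>
  <executions>
    <!-- Attach the JaCoCo Java agent to Surefire (unit tests) -->
    <execution>
      <id>prepare-unit</id>
      <goals><goal>prepare-agent</goal></goals>
    </execution>
    <!-- Attach the JaCoCo Java agent to Failsafe (integration tests) -->
    <execution>
      <id>prepare-it</id>
      <goals><goal>prepare-agent-integration</goal></goals>
    </execution>
    <!-- Generate both coverage reports after the tests have run -->
    <execution>
      <id>report</id>
      <phase>verify</phase>
      <goals><goal>report</goal><goal>report-integration</goal></goals>
    </execution>
  </executions>
</plugin>
```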

Chapter 10

In this tenth video, I show how to write logs in the application and how to configure a logger. I show two loggers (Logback and Log4j2) and a logging facade (SLF4J). Using a logging facade, I don't need to change the code when changing the logging engine.

In the code, I show the different logging levels and how to print variables in an optimal way.

For the configuration, I show for both loggers the layouts needed to reach a similar result. I also show how to print to a file and to the console at the same time.
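For Logback, printing the same layout to both the console and a file might look like this logback.xml sketch (the pattern and file name are assumptions):

```xml
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>application.log</file>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- Both appenders attached to the root logger: console and file at once -->
  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```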

Chapter 11

In this video I set up some GitHub Actions workflows. I create two separate workflows: one for the pipelines coming from the main branch, which will be deployed, and a second one for the remaining branches, to ensure all the tests, integration tests and coverage checks pass.

To have separate workflows, I must include and exclude the branches respectively in both workflows to target them correctly. The deployment workflow runs two jobs: one runs the tests and the other deploys if the tests are successful. For that, I need to create a dependency between the two jobs inside the workflow.
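The deployment workflow might be sketched as follows (job names and Maven commands are assumptions; the `needs` key creates the dependency between the two jobs):

```yaml
name: deploy
on:
  push:
    branches:
      - main          # the second workflow excludes main instead

jobs:
  tests:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - run: mvn test

  deploy:
    needs: tests      # only runs if the tests job succeeded
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v3
      - run: mvn package
```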

To run this, I've used GitHub's self-hosted runners. To use a self-hosted runner, I need to create a Docker image on my local machine that is linked to the GitHub repository. This way, the whole pipeline will run inside my local image.

To create a self-hosted runner, I first need to retrieve the runner binaries from https://github.com/serlesen/backend-social-network/settings/actions/runners/new. Then, I need to configure it with the given token (available at the previous URL). Finally, I need to tell my pipelines to run on a self-hosted runner. To ease this and avoid depending on the host OS, I've created a Docker image as follows:

```dockerfile
FROM debian:latest

# The registration token is injected at build time. Note that build args are
# kept in the image history, so this image should stay private.
ARG TOKEN=not-set

RUN apt-get update && apt-get install -y curl libgtk-dotnet3.0-cil

# The runner refuses to start as root unless this variable is set.
ENV RUNNER_ALLOW_RUNASROOT=1

WORKDIR /actions-runner

RUN curl -o actions-runner-linux-x64-2.294.0.tar.gz -L https://github.com/actions/runner/releases/download/v2.294.0/actions-runner-linux-x64-2.294.0.tar.gz && \
    tar xzf ./actions-runner-linux-x64-2.294.0.tar.gz

# Register the runner against the repository at build time; --unattended
# avoids the interactive prompts during the image build.
RUN ./config.sh --unattended --url https://github.com/serlesen/backend-social-network --token $TOKEN --name linux --work _work --runasservice --disableupdate

CMD ["./run.sh"]
```

This way, building the image takes the token as a build argument:

```shell
docker build . -t github_actions_runner --build-arg TOKEN=<the-token>
```

And run it as follows:

```shell
docker run github_actions_runner
```
