There are 35 repositories under the cross-entropy-loss topic.
Binary and Categorical Focal loss implementation in Keras.
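Focal loss down-weights well-classified examples so training focuses on hard ones. A minimal pure-Python sketch of the binary form (the function name and default hyperparameters are illustrative, not taken from the repo):

```python
import math

def binary_focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss for a single example.

    p: predicted probability of the positive class, y: 0/1 label.
    With gamma=0 this reduces to alpha-weighted cross-entropy.
    """
    p = min(max(p, eps), 1.0 - eps)         # clip to avoid log(0)
    p_t = p if y == 1 else 1.0 - p          # probability of the true class
    a_t = alpha if y == 1 else 1.0 - alpha  # class-balancing weight
    return -a_t * (1.0 - p_t) ** gamma * math.log(p_t)

# A confidently correct example contributes far less than a hard one.
easy = binary_focal_loss(0.95, 1)
hard = binary_focal_loss(0.30, 1)
```

The `(1 - p_t) ** gamma` modulating factor is what distinguishes focal loss from plain weighted cross-entropy.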
Code for ICCV2019 "Symmetric Cross Entropy for Robust Learning with Noisy Labels"
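The symmetric cross-entropy in that paper adds a reverse cross-entropy term (prediction and noisy label swapped) to the standard one. A rough sketch, assuming one-hot labels and with illustrative default hyperparameters:

```python
import math

def sym_cross_entropy(p, q, alpha=0.1, beta=1.0, log_clip=-4.0):
    """Symmetric cross-entropy sketch: SCE = alpha * CE(q, p) + beta * RCE(p, q).

    p: predicted distribution, q: (possibly noisy) one-hot label distribution.
    RCE swaps the roles of p and q; log(0) on the one-hot label is clipped
    to a constant (log_clip) so the term stays finite.
    """
    eps = 1e-12
    ce = -sum(qi * math.log(max(pi, eps)) for pi, qi in zip(p, q))
    rce = -sum(pi * (math.log(qi) if qi > 0 else log_clip)
               for pi, qi in zip(p, q))
    return alpha * ce + beta * rce
```

The RCE term is bounded, which is what gives the combined loss its robustness to label noise.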
A PyTorch implementation of U-Net for aerial imagery semantic segmentation.
Implementation of key neural-network concepts via NumPy.
PyTorch implementations of common modules, blocks, and losses for CNNs, specifically for segmentation models.
Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
The most basic LSTM tagger model in PyTorch; explains the relationship between NLL loss, cross-entropy loss, and the softmax function.
A feed-forward neural network with ReLU activation, cross-entropy loss, and the Adam optimizer.
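That relationship is exactly the one PyTorch encodes: cross-entropy on raw logits equals NLL applied to log-softmax outputs. A few lines of plain Python make it concrete:

```python
import math

def log_softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    lse = m + math.log(sum(math.exp(z - m) for z in logits))
    return [z - lse for z in logits]

def nll_loss(log_probs, target):
    # negative log-likelihood expects *log*-probabilities as input
    return -log_probs[target]

def cross_entropy(logits, target):
    # cross-entropy = log-softmax followed by NLL (cf. F.cross_entropy)
    return nll_loss(log_softmax(logits), target)
```

This is why, in PyTorch, a model ending in `LogSoftmax` pairs with `NLLLoss`, while a model emitting raw logits pairs with `CrossEntropyLoss`.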
Decision Tree Implementation from Scratch
Code for the Paper : NBC-Softmax : Darkweb Author fingerprinting and migration tracking (https://arxiv.org/abs/2212.08184)
Neural network to predict which clothing item is shown from the Fashion-MNIST dataset, using a single hidden layer.
Maths behind machine learning and some implementations from scratch.
C code for the Artificial Intelligence course and its algorithms.
A classifier to differentiate between Cat and Non-Cat Images
Breast Cancer Classification with Logistic Regression
Comparison of common loss functions in PyTorch using MNIST dataset
This repository contains two models, a two-layer ANN and an L-layer ANN, for classifying cat and non-cat photos. The ANNs are built on the mathematical principles of logistic regression and cross-entropy.
Neural Networks from scratch (Inspired by Michael Nielsen book: Neural Nets and Deep Learning)
Neural network-based character recognition using MATLAB. The algorithm does not rely on external ML modules, and is rigorously defined from scratch. A report is included which explains the theory, algorithm performance comparisons, and hyperparameter optimization.
Multiclass classification using softmax from scratch, without popular libraries like TensorFlow or PyTorch.
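One reason softmax is implemented from scratch alongside cross-entropy is that the combined gradient is remarkably simple: for an integer target class t, the derivative of the loss with respect to the logits is `softmax(logits) - one_hot(t)`. A pure-Python sketch with a finite-difference check (all names are illustrative):

```python
import math

def softmax(z):
    m = max(z)  # shift by the max logit to avoid exp overflow
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def ce(z, t):
    # cross-entropy of logits z against integer class t
    return -math.log(softmax(z)[t])

def ce_grad(z, t):
    # analytic gradient: softmax(z) - one_hot(t)
    p = softmax(z)
    return [p[i] - (1.0 if i == t else 0.0) for i in range(len(z))]

# Finite-difference check of the analytic gradient.
z, t, h = [0.5, -1.0, 2.0], 2, 1e-6
num = [(ce([v + (h if i == j else 0.0) for j, v in enumerate(z)], t) - ce(z, t)) / h
       for i in range(len(z))]
```

The max-subtraction trick in `softmax` leaves the result unchanged mathematically but prevents `exp` from overflowing on large logits.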
This project aims to generate new song lyrics based on the context and style of an artist's previously released songs. We chose a Kaggle dataset of over 57,000 songs by over 650 artists; each entry contains the artist name, song name, a reference link, and the lyrics. We train a character-level RNN language model on this dataset, evaluate its accuracy, and tune its parameters. The trained model predicts the next character from the preceding sequence and generates new lyrics in an artist's style.
MSc thesis at FER, 2021/22, supervised by Assoc. Prof. Marko Čupić.
A deep learning framework built on NumPy, written to learn how everything works under the hood.
In this X-ray classification assignment, we built a deep learning model to classify chest X-ray images into "nofinding" and "effusion" classes. We addressed class imbalance using data augmentation and weighted cross-entropy to improve model performance. The goal was to identify abnormalities with high accuracy.
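Weighted cross-entropy counters class imbalance by up-weighting the rare class in the loss. A minimal binary sketch (the weights shown are illustrative; in practice they are often set from class frequencies, e.g. `w_pos = n_neg / n_pos` when positives such as "effusion" are scarce):

```python
import math

def weighted_bce(p, y, w_pos, w_neg, eps=1e-7):
    """Weighted binary cross-entropy for one example.

    p: predicted probability of the positive class, y: 0/1 label.
    w_pos, w_neg: per-class weights; with both equal to 1 this is plain BCE.
    """
    p = min(max(p, eps), 1.0 - eps)  # clip to avoid log(0)
    return -(w_pos * y * math.log(p) + w_neg * (1 - y) * math.log(1.0 - p))
```

Raising `w_pos` makes each missed positive cost more, pushing the model away from always predicting the majority class.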
Full visualization of the Netflix and MovieLens datasets, with 89% accuracy using item2vec.
Implemented and trained Siamese network for image classification using contrastive loss, triplet loss and regularized cross-entropy loss
KL severity grading using SE-ResNet and SE-DenseNet architectures trained with Cross Entropy loss and Focal Loss. The hyperparameters of focal loss have been fine-tuned as well. Further, Grad-CAM has been implemented for visualization purposes.
Deep neural networks (single-layer and multi-layer perceptrons) implemented with TensorFlow on the MNIST and Naval Mine datasets for categorical classification; includes saving and restoring TensorFlow variable weights for testing.
Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). You can also choose different loss functions: cross-entropy loss, hinge loss, or mean squared error (MSE).
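Of the selectable losses, cross-entropy is sketched elsewhere on this page; the other two are just as short. A pure-Python sketch of multiclass hinge loss and MSE (function names are illustrative):

```python
def multiclass_hinge(scores, target, margin=1.0):
    """Multiclass hinge (SVM) loss: penalize every wrong class whose score
    comes within `margin` of the true class's score."""
    correct = scores[target]
    return sum(max(0.0, s - correct + margin)
               for i, s in enumerate(scores) if i != target)

def mse(pred, target):
    """Mean squared error over a prediction vector."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)
```

Unlike cross-entropy, the hinge loss is exactly zero once every wrong class trails the true class by at least the margin, so confident correct predictions stop contributing gradient.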
Digital Image Processing Course | Home Works Design| Fall 2021 | Dr. MohammadReza Mohammadi
Predict whether the cancer is benign or malignant using logistic regression model.
Implementation of a Fully Connected Neural Network, Convolutional Neural Network (CNN), and Recurrent Neural Network (RNN) from Scratch, using NumPy.
I implemented a CNN to train and test a handwritten digit recognition system using the MNIST dataset. I also read the paper “Backpropagation Applied to Handwritten Zip Code Recognition” by LeCun et al. 1989 for more details, but my architecture does not mirror everything mentioned in the paper. I also carried out a few experiments such as adding different dropout rates, using batch normalization, and using different optimizers in the baseline model. Finally, I discuss the impact of experiments on the learning curves and testing performance.
Machine Learning: Regression and Classification. Andrew presents an introductory machine learning specialization: the first and second courses offer practice in regression and classification, and the third focuses on recommender systems and reinforcement learning.
Built a custom Adam optimizer with gradient clipping, learning-rate scheduling, and momentum updates, using two different loss functions.
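The pieces combined in that optimizer can be sketched in a few lines of pure Python; this is a generic illustration of global-norm clipping plus one Adam update with bias correction, not the repo's actual code (default hyperparameters are the common textbook values):

```python
import math

def clip_by_norm(grads, max_norm):
    # Global-norm gradient clipping: rescale the whole gradient vector
    # if its L2 norm exceeds max_norm (cf. torch.nn.utils.clip_grad_norm_).
    norm = math.sqrt(sum(g * g for g in grads))
    if norm > max_norm:
        scale = max_norm / norm
        grads = [g * scale for g in grads]
    return grads

def adam_step(params, grads, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update (step counter t starts at 1)."""
    new_p, new_m, new_v = [], [], []
    for p, g, mi, vi in zip(params, grads, m, v):
        mi = b1 * mi + (1 - b1) * g       # first moment (momentum)
        vi = b2 * vi + (1 - b2) * g * g   # second moment
        m_hat = mi / (1 - b1 ** t)        # bias correction
        v_hat = vi / (1 - b2 ** t)
        new_p.append(p - lr * m_hat / (math.sqrt(v_hat) + eps))
        new_m.append(mi)
        new_v.append(vi)
    return new_p, new_m, new_v
```

An LR schedule then simply varies the `lr` argument across steps (e.g. warmup followed by decay) while the moment buffers `m` and `v` are carried between calls.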