Mohamed Naji Aboo's repositories
langchain_llm
LangChain LLM
awscloudwatchlogger
AWS CloudWatch logger implementation using Python and Flask
llama-recipes-nm
Examples and recipes for the Llama 2 model
AnsweringQuestionsWithHuggingFaceAndLLM
Answering Questions With HuggingFace And LLM
Automatic-assignment-of-ICD10-codes
Project on the assignment of ICD codes to medical/clinical text
azure-text-analytics-example
Azure Text Analytics
chatgpt3.5_finetune
ChatGPT 3.5 fine-tuning
Complete-MLOps-BootCamp
A hands-on guide to MLOps practices for taking your model from laptop to production. Created by the author, Nachiketh Murthy.
ECG_image_classification
ViT-based ECG image classification model
falcontune
Tune any Falcon model in 4-bit
finetuned-qlora-falcon7b-medical
Fine-tuning of the Falcon-7B LLM using QLoRA on a mental health conversational dataset
GenZ
An instruction-fine-tuned model based on XGen-7B
gettingdata-samples
Samples and demos for Getting Data
google-gemini
Detailed code explanation of Google's Gemini LLM
guardrails
Adding guardrails to large language models.
Indian-LawyerGPT
Fine-tuning Falcon-7B with QLoRA to build an AI model with a deep understanding of the Indian legal context.
johnsnowlabs
Gateway into the John Snow Labs Ecosystem
Llama-2-Open-Source-LLM-CPU-Inference
Running Llama 2 and other open-source LLMs locally on CPU for document Q&A
LLaMA-Efficient-Tuning
Easy-to-use fine-tuning framework using PEFT (PT+SFT+RLHF with QLoRA) (LLaMA-2, BLOOM, Falcon, Baichuan)
llm-rpg
Write your own LOTR story with a LoRA-tuned BLOOM-3B
NeMo-Guardrails
NeMo Guardrails is an open-source toolkit for easily adding programmable guardrails to LLM-based conversational systems.
open-interpreter
OpenAI's Code Interpreter in your terminal, running locally
SuperKnowa
This repository is intended for IBM Ecosystem partners. It contains pluggable components designed to tackle various Generative AI use cases using Large Language Models (LLMs).
xgen
Salesforce open-source LLMs with 8k sequence length.
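Several of the repositories above (finetuned-qlora-falcon7b-medical, Indian-LawyerGPT, LLaMA-Efficient-Tuning, llm-rpg) revolve around LoRA/QLoRA fine-tuning. As background, here is a minimal NumPy sketch of the core LoRA idea: instead of updating a full weight matrix W, train two small low-rank matrices A and B and apply W_eff = W + (alpha / r) * B @ A. The shapes and values below are illustrative toys, not drawn from any of the listed repos.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4  # toy sizes; real models use dimensions in the thousands

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-initialized

# LoRA update: a rank-r correction scaled by alpha / r
W_eff = W + (alpha / r) * B @ A

# With B zero-initialized, the adapter starts as a no-op:
print(np.allclose(W_eff, W))  # True

# Trainable parameter count drops from d_out*d_in to r*(d_in + d_out):
print(W.size, A.size + B.size)  # 64 32
```

During training only A and B receive gradients, which is why QLoRA can additionally keep W in 4-bit precision (as in falcontune) while the small adapters stay in full precision.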