Hi, I am Chandrav Rajbangshi, a third-year B.Tech student in Computer Science at Birla Institute of Technology, Mesra, with a keen interest in Artificial Intelligence. I am currently working on impactful NLP solutions and exploring how to improve alignment in LLMs. I am open to research and industry opportunities starting May 2026.

Education
Birla Institute of Technology, Mesra
B.Tech in Computer Science and Engineering
(2023 - 2027)
Sai Vikash Junior College
Higher Secondary Education
(2021 - 2023)
Experience
AI Research Intern, RAAPID Inc.
Co-developed GRIT, a novel fine-tuning method that updates 42% fewer parameters and trains 29% faster than QLoRA while maintaining state-of-the-art accuracy.
(May 2025 - Present)
Latest Updates
  • May 2025: Started my AI research internship at RAAPID Inc., focusing on developing novel fine-tuning techniques and improving LLM performance on medical evidence detection and ICD-10-CM code generation.
Publications
Submitted to ACL 2026 · Under Review
A parameter-efficient fine-tuning method that reduces catastrophic forgetting by concentrating updates along curvature-informed directions.
Projects
Conversational RAG With PDFs
Built a RAG system that queries 100+ PDFs with contextual memory, achieving 85%+ retrieval accuracy. Used embedding models and a vector database for intelligent document querying with conversation history.
View on GitHub
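The retrieval step at the heart of a RAG system can be sketched in a few lines. This is a minimal, hypothetical illustration using a toy bag-of-words "embedding" and cosine similarity (the actual project uses neural embeddings and a vector database; all function names here are assumptions):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real RAG system uses a neural embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Rank document chunks by similarity to the query; the top-k chunks
    would then be passed to the LLM as context."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

Conversation history can be handled by prepending prior turns to the query before embedding, which is one common way "contextual memory" is implemented.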
Fine-tune-BERT
Fine-tuned the BERT model for a text classification task, achieving 92% accuracy on the target dataset. Implemented custom training loops and tokenization strategies to optimize for downstream NLP tasks.
View on GitHub
LoRA-from-scratch
Implemented the Low-Rank Adaptation (LoRA) technique from scratch for efficient fine-tuning of large language models. This project demonstrates a deep understanding of parameter-efficient fine-tuning (PEFT) methods.
View on GitHub
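The core LoRA idea is small enough to sketch directly: keep the pretrained weight W frozen and learn a low-rank update (alpha / r) * A @ B. A minimal pure-Python sketch, assuming a row-vector convention and list-of-rows matrices (helper names are illustrative, not from the repository):

```python
def matmul(A, B):
    """Multiply two matrices represented as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def lora_forward(x, W, A, B, alpha=16, r=4):
    """Compute x @ (W + (alpha / r) * A @ B) without merging the weights.

    W: frozen pretrained weight (d_in x d_out)
    A: trainable down-projection (d_in x r), B: trainable up-projection (r x d_out).
    B is initialised to zeros so training starts from the pretrained model exactly.
    """
    base = matmul(x, W)                 # frozen path
    low = matmul(matmul(x, A), B)       # low-rank adapter path
    s = alpha / r
    return [[b + s * l for b, l in zip(rb, rl)] for rb, rl in zip(base, low)]
```

Because only A and B are trained, the number of updated parameters drops from d_in * d_out to r * (d_in + d_out), which is the source of LoRA's efficiency.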
Transformer-from-scratch
Developed a complete Transformer neural network from scratch in Python, implementing the self-attention mechanism, positional encoding, and the full encoder-decoder architecture to understand the model's core components.
View on GitHub
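The self-attention mechanism mentioned above reduces to softmax(Q K^T / sqrt(d_k)) V. A minimal single-head sketch in pure Python, with matrices as lists of row vectors (a simplification of a full multi-head implementation):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Each query attends over all keys; the output row for a query is the
    attention-weighted average of the value vectors.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

In the full architecture, Q, K, and V come from learned linear projections of the input, and positional encodings are added beforehand so the model can distinguish token order.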
CaptionFlow
Hybrid CNN-Transformer model for image captioning, improving BLEU score by 15%. Built with advanced attention mechanisms and transfer learning techniques for generating accurate and contextual image descriptions.
View on GitHub
PneumoNet
Designed a Convolutional Neural Network (CNN) to detect pneumonia from chest X-ray images with 95% accuracy. Utilized transfer learning and data augmentation to enhance model robustness and performance.
View on GitHub
Hand-sign-number-detector
Created a real-time hand sign number detector using TensorFlow and OpenCV. The model accurately classifies numerical hand gestures from a live webcam feed, showcasing practical computer vision application.
View on GitHub