Codes & Data

A paper without accessible code and data is just a paper; with them, it becomes more than a paper, perhaps even a work of art. Our team has released 170 open-source repositories, and all related code and data are available at https://github.com/iLearn-Lab.

| Year | Published at | Title | Paper | Codes and Data |
|------|--------------|-------|-------|----------------|
| 2017 | WWW | Neural Collaborative Filtering | PDF | GitHub |
| 2026 | - | A Survey on Diffusion Policy for Robotic Manipulation | PDF | GitHub |
| 2019 | ACM MM | MMGCN: Multi-modal Graph Convolution Network for Personalized Recommendation of Micro-video | PDF | GitHub |
| 2025 | ACM TOIS | A Comprehensive Survey on Composed Image Retrieval | PDF | GitHub |
| 2017 | CVPR | SCA-CNN: Spatial and Channel-Wise Attention in Convolutional Networks for Image Captioning | PDF | GitHub |
| 2025 | NeurIPS | CogVLA: Cognition-Aligned Vision-Language-Action Models via Instruction-Driven Routing & Sparsification | PDF | GitHub |
| 2024 | NeurIPS | Optimus-1: Hybrid Multimodal Memory Empowered Agents Excel in Long-Horizon Tasks | PDF | GitHub |
| 2025 | ACL | AdaReTaKe: Adaptive Redundancy Reduction for Long-Context Video-Language Understanding | PDF | GitHub |
| 2026 | TPAMI | A Survey on Video Temporal Grounding with Multimodal Large Language Model | PDF | GitHub |
| 2018 | WWW | TEM: Tree-enhanced Embedding Model for Explainable Recommendation | PDF | GitHub |
| 2021 | ACM MM | Contrastive Learning for Cold-Start Recommendation | PDF | GitHub |
| 2025 | ACM MM | UAV-ON: A Benchmark for Open-World Object Goal Navigation with Aerial Agents | PDF | GitHub |
| 2021 | SIGIR | Dynamic Modality Interaction Modeling for Image-Text Retrieval | PDF | GitHub |
| 2025 | ACL | GUI-explorer: Autonomous Exploration and Mining of Transition-aware Knowledge for GUI Agent | PDF | GitHub |
| 2024 | ACM MM | Diffusion Facial Forgery Detection | PDF | GitHub |