
[CV] [Homepage] [Transcript]

Self Introduction

I am Jiacheng Luo (罗嘉诚), a junior student in the Department of Computer Science and Engineering (CSE) at SUSTech, majoring in Computer Science and Technology. Currently, my academic advisor is Prof. Jianguo Zhang, and my life advisor is Assistant Prof. Bin Zhu.

  • Prof. Jianguo Zhang leads the CVIP Group laboratory at SUSTech. He previously served as a Reader in the School of Science and Engineering at the University of Dundee, UK, and as Director of International Cooperation in its Department of Computer Science.
  • Prof. Bin Zhu is an assistant professor and doctoral supervisor at the School of Public Health and Emergency Management (SPHEM) at SUSTech.

The main research areas of the CVIP Group laboratory are computer vision, medical image and information processing, machine learning, and artificial intelligence.

My research interests include Domain Adaptation, Transfer Learning, Parameter-Efficient Fine-Tuning, and Large Model Training.

Contact Me

Academic Background

  • Sep. 2021 - Jun. 2025 (expected): Southern University of Science and Technology (BEng.)

Research Interests

Domain Adaptation
    Domain adaptation is a field associated with machine learning and transfer learning. It arises when we aim to learn a model from a source data distribution and apply that model to a different (but related) target data distribution. For instance, one task in the common spam-filtering problem consists of adapting a model from one user (the source distribution) to a new user who receives significantly different emails (the target distribution). Domain adaptation has also been shown to be beneficial when learning from unrelated sources. Note that when more than one source distribution is available, the problem is referred to as multi-source domain adaptation.
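As a toy illustration (not drawn from the text above), one of the simplest forms of unsupervised domain adaptation is to align the target distribution's feature statistics with the source's. The NumPy sketch below matches per-feature mean and standard deviation; all names and values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Source domain: features centered near 0; target domain: same task, shifted scale.
source = rng.normal(loc=0.0, scale=1.0, size=(100, 3))
target = rng.normal(loc=5.0, scale=2.0, size=(100, 3))

# Align the target distribution to the source by matching per-feature
# mean and standard deviation (a crude form of marginal alignment).
aligned = (target - target.mean(axis=0)) / target.std(axis=0)
aligned = aligned * source.std(axis=0) + source.mean(axis=0)

# A model fit to source statistics now sees target features on the same scale.
```

More sophisticated methods align higher-order statistics or learn domain-invariant features, but the goal is the same: reduce the gap between source and target distributions.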
Transfer Learning
    Transfer learning is a machine learning technique in which knowledge learned from one task is reused to boost performance on a related task. For example, in image classification, knowledge gained while learning to recognize cars could be applied when trying to recognize trucks. The topic is related to the psychological literature on transfer of learning, although practical ties between the two fields are limited. Reusing information from previously learned tasks can significantly improve learning efficiency.
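A minimal sketch of the idea, where a frozen random projection stands in for a feature extractor pre-trained on a source task, and only a small new head is fit for the (hypothetical) target task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these weights were learned on a source task (e.g. recognizing cars).
pretrained_features = rng.normal(size=(8, 4))  # maps 8-dim input -> 4-dim features

def extract(x):
    """Frozen feature extractor reused from the source task."""
    return np.maximum(x @ pretrained_features, 0.0)  # ReLU features

# New, related target task (e.g. recognizing trucks): train only a small head.
X = rng.normal(size=(64, 8))
y = (X.sum(axis=1) > 0).astype(float)

F = extract(X)
# Closed-form least-squares fit of the head; the extractor stays fixed.
head, *_ = np.linalg.lstsq(F, y, rcond=None)

preds = (F @ head > 0.5).astype(float)
accuracy = (preds == y).mean()
```

In practice the extractor would be a deep network pre-trained on a large dataset (e.g. ImageNet), and the head a small trainable layer, but the division of labor is the same.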
Parameter-Efficient Fine-Tuning
    Parameter-efficient fine-tuning (PEFT) is a technique, popularized in natural language processing (NLP), for adapting pre-trained language models to specific downstream tasks. Instead of training the entire model from scratch, it reuses the pre-trained parameters and fine-tunes the model on a smaller dataset, saving computational resources and time. PEFT achieves this efficiency by freezing most of the pre-trained layers and updating only a small, task-specific subset of parameters, often just the last few layers. This way, the model can be adapted to new tasks with less computational overhead and fewer labeled examples. Although PEFT is a relatively recent term, updating only the last layer of a model has been common practice in computer vision since the introduction of transfer learning; even in NLP, experiments with static and non-static word embeddings were carried out early on. PEFT aims to improve the performance of pre-trained models such as BERT and RoBERTa on downstream tasks including sentiment analysis, named entity recognition, and question answering, particularly in low-resource settings with limited data and compute. Because it modifies only a small subset of model parameters, it is also less prone to overfitting.
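The freeze-and-tune recipe can be sketched with plain NumPy. The two-layer toy model below is hypothetical, but the pattern — a frozen pre-trained body and a small trainable head — is the one described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer model standing in for a large pre-trained network.
W_frozen = rng.normal(size=(16, 8))   # pre-trained body: never updated
W_head = np.zeros(8)                  # task-specific head: the only trainable part

X = rng.normal(size=(128, 16))        # toy downstream dataset
y = rng.normal(size=128)

hidden = np.tanh(X @ W_frozen)        # forward pass through the frozen body

lr = 0.05
for _ in range(200):
    # Gradient of mean-squared error with respect to the head only.
    err = hidden @ W_head - y
    W_head -= lr * hidden.T @ err / len(y)

# Only 8 of the 16*8 + 8 = 136 parameters were ever touched.
```

Methods such as adapters and LoRA generalize this idea by inserting small trainable modules throughout the network rather than tuning only the final layer.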
Large Model Training
    Large model training is the process of training machine learning or deep learning models that possess a significant number of parameters or exhibit complex architectures. It requires substantial computational resources, such as GPUs or TPUs, along with extensive datasets for effective training. Using optimization algorithms such as stochastic gradient descent (SGD) or its variants, large model training iteratively updates model parameters to optimize performance. Techniques such as mini-batch training, regularization, and learning-rate scheduling are often employed to improve convergence and mitigate overfitting. This approach is widely used in natural language processing, computer vision, and reinforcement learning, where intricate data patterns require sophisticated models for effective analysis and prediction.
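A minimal sketch of mini-batch SGD with a simple learning-rate decay schedule, on a toy linear-regression problem (all names and values are illustrative; real large-model training adds distributed execution, mixed precision, and far more elaborate schedules):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data; true weights to recover.
true_w = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(512, 3))
y = X @ true_w + 0.01 * rng.normal(size=512)

w = np.zeros(3)
batch_size, base_lr = 32, 0.1

for epoch in range(20):
    lr = base_lr / (1 + 0.1 * epoch)          # simple learning-rate decay schedule
    order = rng.permutation(len(X))           # reshuffle data each epoch
    for start in range(0, len(X), batch_size):
        idx = order[start:start + batch_size]
        # Mini-batch gradient of the mean-squared-error loss.
        grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
        w -= lr * grad
```

Each mini-batch gives a cheap, noisy estimate of the full gradient; the decaying learning rate damps that noise as training proceeds so the iterates settle near the optimum.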

News and Updates

  • [Feb 02, 2024] One co-authored paper has been submitted to ICML 2024 for consideration.
  • [Aug 31, 2023] My personal academic website is now online.
  • [Jul 18, 2023] Honored to join the CVIP Group as a formal member, and I hope to do good work!
  • [Aug 22, 2022] Happy to join the CVIP Group as an unofficial attending student!

Jiacheng Luo's Projects

maystern

Something about Maystern(Jiacheng Luo) aka Me

maystern.github.io

GitHub Pages template for academic personal websites, forked from mmistakes/minimal-mistakes

othellooo

Project for the SUSTech Java course: a ray-traced Othello game built on LWJGL.

teedy

Lightweight document management system packed with all the features you can expect from big expensive solutions
