Experience

  1. RNN Analysis on Same-Different Task

    Chinese University of Hong Kong

    Advisor: Dr. Xiangbin Teng

    • Model Training: Trained RNNs on the same-different task under varying noise levels using neurogym, and refactored the training code for readability and extensibility.
    • Model Analysis: Analyzed normalized trial-averaged activity and principal components (PCA) of RNN hidden states, linearly regressed activity at different time points onto stimulus values, and characterized the temporal scope of stimulus encoding.
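The hidden-state PCA step above can be sketched roughly as follows. This is a minimal illustration with synthetic data standing in for actual RNN activity; the array shapes and the use of scikit-learn's `PCA` are assumptions, not the project's exact pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical hidden-state tensor: (trials, time steps, hidden units).
# Synthetic data stands in for recorded RNN activity on the task.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((100, 50, 64))

# Flatten trials and time, fit PCA on all time points, then reshape
# back so each trial becomes a trajectory in the top-3 PC space.
flat = hidden.reshape(-1, hidden.shape[-1])
pca = PCA(n_components=3)
components = pca.fit_transform(flat).reshape(100, 50, 3)

# Trial-averaged trajectory through PC space over time.
trajectory = components.mean(axis=0)  # shape: (50, 3)
print(trajectory.shape, pca.explained_variance_ratio_)
```

With real data, inspecting these trial-averaged PC trajectories (e.g. split by stimulus condition) is one common way to visualize how the network's state evolves over a trial.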
  2. Large-Model-Based Cross-Modal Chinese Poetry Creation

    School of Computer Science, Wuhan University

    Advisor: Dr. Weiping Zhu

    • System Development: Led the development of modules supporting cross-modal text and image inputs, and enhanced the system's iterative optimization mechanism.
    • System Evaluation: Evaluated poem quality across input modalities and optimization settings on three poem sets.
  3. Data Analysis on Forward Flow Task

    NKLCNL, Beijing Normal University

    Advisor: Prof. Yunzhe Liu

    • Data Preprocessing: Preprocessed word data from forward-flow tasks: inserted seed words, removed duplicates, and generated word embeddings.
    • Correlation Analysis: Analyzed correlations between participants’ scale scores and statistical indicators, including sequence length, embedding similarity, optimality divergence, semantic distance range, and “forward flow”.
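The “forward flow” indicator mentioned above can be computed roughly as below. This is a sketch using random unit vectors in place of real word embeddings; the definition used here (each word's mean embedding distance to all preceding words, averaged across the sequence) follows the published forward-flow measure and is an assumption about this project's exact formula.

```python
import numpy as np

# Hypothetical word embeddings for one participant's response sequence:
# 10 words, 300-dimensional vectors, normalized to unit length so that
# dot products give cosine similarity.
rng = np.random.default_rng(1)
emb = rng.standard_normal((10, 300))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

def forward_flow(e):
    """Mean cosine distance from each word to all preceding words,
    averaged over the sequence (assumed forward-flow definition)."""
    flows = []
    for i in range(1, len(e)):
        dists = 1.0 - e[:i] @ e[i]  # cosine distances to earlier words
        flows.append(dists.mean())
    return float(np.mean(flows))

print(forward_flow(emb))
```

The per-participant score could then be correlated against scale scores, e.g. with `scipy.stats.pearsonr`.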

Education

  1. MS Computational Neuroscience

    University of Tübingen
  2. BEng Software Engineering

    Wuhan University

    GPA: 3.73/4.0

    Courses included:

    • Advanced Mathematics, Discrete Mathematics, Linear Algebra, Probability & Statistics.
    • Data Structures, Computer Organization, Operating Systems, Database Systems, Computer Networks.
    • Machine Learning.

Skills & Hobbies

  • Technical Skills: Python, Machine Learning
  • Hobbies: Hiking, Reading

Awards

  Student of Computational Neuroscience

    NeuroMatch Academy ∙ July 2024

    I studied the foundational concepts of computational neuroscience through active learning in groups. The curriculum spans most areas of computational neuroscience, including machine learning, dynamical systems, stochastic processes, and modeling methodology. To complete the course, I worked with partners on the project “The Working Memory Capacity of RNNs”.

Languages

  • Chinese (Mandarin): 100%
  • English: 50%