

Head Professor, Next-Generation Intelligent Semiconductor Program

Professor Jae-Joon Kim

Location: Room 805, Building 301, College of Engineering
Phone: 02-880-1803
Email: kimjaejoon@snu.ac.kr
Degree: Ph.D. in Electrical and Computer Engineering, Purdue University
Research Areas: Neural network compression, deep learning accelerator circuit design
Education
– Ph.D., Electrical and Computer Engineering, Purdue University (2004)
– M.S., Electrical Engineering, Seoul National University (1998)
– B.S., Electronics Engineering, Seoul National University (1994)
Professional Experience
– 2004.05~2013.01: Research Staff Member, IBM T. J. Watson Research Center
– 2013.02~2019.02: Associate Professor, POSTECH
– 2019.03~2021.08: Professor, POSTECH
– 2021.09~Present: Professor, Seoul National University
Recent Research Achievements
– Hyesung Jeon*, Seojune Lee*, Beomseok Kang, Yulhwa Kim, and Jae-Joon Kim, “QWHA: Quantization-Aware Walsh-Hadamard Adaptation for Parameter-Efficient Fine-Tuning on Large Language Models,” International Conference on Learning Representations (ICLR), Apr. 2026 (*equally contributed).
– Seonghwan Choi*, Beomseok Kang*, Dongwon Jo, and Jae-Joon Kim, “Retrospective Sparse Attention for Efficient Long-Context Generation,” International Conference on Learning Representations (ICLR), Apr. 2026 (*equally contributed).
– Munhyeon Kim, Sukhyun Choi, Yulhwa Kim, and Jae-Joon Kim, “3D Integration of Hybrid IGZO/Si and IGZO eDRAMs for High-Density/High-Performance On-Chip Memory,” Design Automation and Test in Europe (DATE), Apr. 2026 (Nominated for the Best Paper Award).