| Title | Authors | Venue | Cited by | Year |
| --- | --- | --- | --- | --- |
| Fine-Tuning Diffusion Models with Limited Data | T Moon, M Choi, G Lee, JW Ha, J Lee | NeurIPS 2022 Workshop on Score-Based Methods | 36 | 2022 |
| A Simple Early Exiting Framework for Accelerated Sampling in Diffusion Models | T Moon, M Choi, EG Yun, J Yoon, G Lee, J Cho, J Lee | International Conference on Machine Learning (ICML) | 17* | 2024 |
| HyperCLOVA X Technical Report | KM Yoo, J Han, S In, H Jeon, J Jeong, J Kang, H Kim, KM Kim, M Kim, ... | arXiv preprint arXiv:2404.01954 | 6 | 2024 |
| Rare-to-Frequent: Unlocking Compositional Generation Power of Diffusion Models on Rare Concepts with LLM Guidance | D Park, S Kim, T Moon, M Kim, K Lee, J Cho | International Conference on Learning Representations (ICLR) | 2 | 2024 |
| How to Move Your Dragon: Text-to-Motion Synthesis for Large-Vocabulary Objects | W Lee, J Jeong, T Moon, HJ Kim, J Kim, G Kim, BU Lee | arXiv preprint arXiv:2503.04257 | | 2025 |
| Efficient Generative Modeling with Residual Vector Quantization-Based Tokens | J Kim, T Moon, K Lee, J Cho | arXiv preprint arXiv:2412.10208 | | 2024 |