AI Systems Innovator Yuke Wang Joins Rice CS Faculty

Wang’s research advances efficient deep learning, secure AI-generated content, and scalable AI systems

Assistant Professor Yuke Wang joins Rice’s faculty for the Fall 2025 semester after earning his PhD from the University of California, Santa Barbara (UCSB) in 2024. Before his PhD, Wang earned a B.E. in Software Engineering from the University of Electronic Science and Technology of China in 2018. 

With professional experience at several big tech companies under his belt, Wang hopes to continue his research into deep learning systems at Rice while creating meaningful coursework for his students. 

He interned with both Microsoft and NVIDIA, received the prestigious NVIDIA Graduate Fellowship at UCSB in 2022, and earned a Google Junior Faculty Award this year. After completing his PhD, he worked with Amazon AWS as a postdoctoral scientist for just under a year before coming to Rice.

Making deep learning programs perform better is the focus of many of his current projects, and he has already contributed to two papers accepted to the 2025 Symposium on Operating Systems Principles (SOSP). One of them, HeteRAG, explores how to make everyday interactions with deep learning systems more personalized to the individual user.

The problem he’s currently working on is twofold: find a way for deep learning models to perform well with conventional hardware, and then scale that method up to operate on multiple devices. In his talk at Rutgers’ Efficient AI (REFAI) Seminar in February, he explained the problem in more detail. 

The graphics processing units (GPUs) in today’s servers or workstations, Wang explains, are often underutilized when running multiple image diffusion tasks, especially if those tasks request images at very different resolutions. Some workloads demand more memory and computing power than others, leaving valuable GPU resources idle. But he and his research team devised a method to fuse requests of varying resolutions into a unified workload, ensuring the hardware is used far more efficiently.

“The question we’re trying to answer is, can we design systems that seamlessly combine these heterogeneous requests so that GPUs are fully leveraged?” Wang said. “And in these scenarios, can we extend our acceleration techniques to not only handle single high-resolution jobs, but also scale effectively across mixed-resolution workloads?”
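As a rough illustration of that idea, the sketch below (not Wang's actual system; the stand-in denoiser, tensor shapes, and padding strategy are assumptions for demonstration) pads latent tensors of different resolutions to a common size so one GPU pass can serve a mixed-resolution batch, then crops each result back to its requested size.

```python
# Minimal sketch (illustrative only): batching mixed-resolution diffusion-style
# requests into one GPU pass by padding to a shared spatial size.
# The "denoiser" here is a stand-in convolution, not a real diffusion model.
import torch
import torch.nn.functional as F

def fuse_requests(latents):
    """Pad variable-resolution latents (C, H, W) to a common size and stack them."""
    max_h = max(x.shape[1] for x in latents)
    max_w = max(x.shape[2] for x in latents)
    padded, sizes = [], []
    for x in latents:
        _, h, w = x.shape
        sizes.append((h, w))
        # Pad on the right/bottom so every request shares one tensor shape.
        padded.append(F.pad(x, (0, max_w - w, 0, max_h - h)))
    return torch.stack(padded), sizes

def split_results(batch, sizes):
    """Crop each fused result back to its originally requested resolution."""
    return [batch[i, :, :h, :w] for i, (h, w) in enumerate(sizes)]

if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    # Three requests at different resolutions, as in the mixed-workload scenario.
    requests = [torch.randn(4, 32, 32), torch.randn(4, 64, 48), torch.randn(4, 96, 96)]
    denoiser = torch.nn.Conv2d(4, 4, kernel_size=3, padding=1).to(device)
    batch, sizes = fuse_requests([r.to(device) for r in requests])
    with torch.no_grad():
        out = denoiser(batch)          # one forward pass serves all requests
    results = split_results(out, sizes)
    print([tuple(r.shape) for r in results])
```

Padding is the simplest way to fuse heterogeneous requests and wastes some compute on the largest request's footprint; the point of the sketch is only to show why serving mixed resolutions in one batch keeps a GPU busier than handling each request on its own.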

Wang is also working to make AI-generated content more secure, exploring a way to digitally watermark it so that it can be verified. Since generated content can be intercepted and altered before it reaches the user, the watermark would act as a kind of serial number, offering another layer of security by verifying that the content people receive hasn’t been changed. 

“For each of the generators, we have its own unique tag, like a watermark. It’s embedded in the image but will not affect the visual effect of the image,” said Wang.
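As a rough sketch of how such a verification tag might work (this is not Wang's method; the shared key, the HMAC-based tag, and the least-significant-bit embedding are assumptions for illustration), the example below derives a unique tag per generator, hides it in an image's lowest bits, and rejects any image whose embedded tag no longer matches.

```python
# Minimal sketch (illustrative only): a per-generator "serial number" watermark.
# Embeds a keyed tag in the least-significant bits of an image and checks it later.
# Real watermarking schemes are far more robust; this only shows the verify idea.
import hashlib, hmac
import numpy as np

def make_tag(generator_id: str, key: bytes, nbytes: int = 16) -> bytes:
    """Derive a unique tag for a given generator from a secret key."""
    return hmac.new(key, generator_id.encode(), hashlib.sha256).digest()[:nbytes]

def embed(image: np.ndarray, tag: bytes) -> np.ndarray:
    """Hide the tag in the lowest bit of the first len(tag) * 8 pixel values."""
    flat = image.flatten()
    bits = np.unpackbits(np.frombuffer(tag, dtype=np.uint8))
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(image.shape)

def verify(image: np.ndarray, generator_id: str, key: bytes, nbytes: int = 16) -> bool:
    """Recover the embedded bits and compare them against the expected tag."""
    bits = image.flatten()[: nbytes * 8] & 1
    recovered = np.packbits(bits.astype(np.uint8)).tobytes()
    return hmac.compare_digest(recovered, make_tag(generator_id, key, nbytes))

if __name__ == "__main__":
    key = b"demo-secret-key"                      # assumed shared verification key
    img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
    marked = embed(img, make_tag("generator-A", key))
    print(verify(marked, "generator-A", key))     # True: content is untampered
    marked[0, 0, 0] ^= 1                          # simulate tampering in transit
    print(verify(marked, "generator-A", key))     # False: watermark check fails
```

Flipping even one embedded bit breaks verification, which captures the "serial number" behavior Wang describes; a research-grade watermark would embed the signal redundantly and robustly so it survives normal compression and editing.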

Other current projects include tackling the problem of 4K video generation and fine-tuning the way large language models (LLMs) retrieve information for us. A paper on 4K video generation, in which Wang and his team used existing lower-resolution video generators working in tandem to create a more refined product, will be released soon. 

Wang said he came to Rice from the corporate world for the creative freedom. In big tech, he explained, the ideas you work on are often dictated by the company and usually profit-motivated. In academia, there’s more flexibility to pursue personal curiosity. 

“I prefer a research-style environment with more freedom to determine what we want to explore, compared to the situation in a company where what you work on is often determined for you,” Wang said. “In the industry, it’s like, ‘we’ll define the problem’ or ‘we’ll study the technique and how to implement it.’”

In his first year at Rice, Wang plans to continue his research and create a curriculum that will motivate his students. He also wants to secure more funding, since researching AI efficiency requires a lot of cutting-edge, expensive hardware. There are also ongoing collaborations with Google, Cisco, NVIDIA, the University of Washington, UT-Austin, Texas A&M University, and UC San Diego that he’s excited to pursue.

John Bogna, contributing writer