Yi Tay

Yi is the chief scientist and a cofounder of Reka. Previously, he was a senior research scientist at Google Brain working on large language models and artificial intelligence. During his time at Google, he contributed to many industry-defining LLMs, including PaLM-1/2, Flan, UL2, LaMDA/Bard, and MUM. He also contributed to large multimodal efforts within Google, such as ViT-22B and PaLi-X. Notably, Yi was the co-lead (modeling & architecture) of PaLM-2.

Before joining the Brain team, he was the tech lead of a research team at Google AI working on transformer architectures, scaling, and modeling. During this time, Yi contributed to roughly 20 product launches across Google products and services.

Before joining Google, he earned his PhD from NTU, Singapore, winning the best PhD thesis award for his contributions to "neural architectures for natural language understanding". He has also won three best paper awards (including runner-ups) in his research career thus far: ICLR 2021, WSDM 2021 (runner-up), and WSDM 2020 (runner-up).