Tech Lead - Research Engineer - TikTok AI Search - LLM Pretraining/Alignment/Inference

TikTok

$208,800 - $438,000
Sep 22, 2025
San Jose, CA, USA

The TikTok Search Team develops and applies cutting-edge machine learning technologies in real-time, large-scale systems that serve billions of search requests daily, improving the search experience for hundreds of millions of users globally through advanced NLP and multi-modal models.

Requirements

  • Experience in one or more of the following areas is preferred: NLP, LLMs, RL.
  • Experience using data-driven methods to enhance LLM capabilities across the various stages of model development.
  • Experience with RAG, prompt engineering, or other inference-time methods to enhance system performance (see the illustrative sketch after this list).
  • Proficient coding skills and a strong foundation in algorithms and data structures.
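As an illustrative aside, the following is a minimal sketch of the kind of inference-time method referenced above: a toy retrieval-augmented generation step that retrieves relevant snippets and assembles an augmented prompt. The corpus, the keyword-overlap scoring, and the generate() stub are hypothetical placeholders, not any part of TikTok's actual systems.

# Minimal RAG-style inference sketch (illustrative only; all names are hypothetical).
from typing import List

CORPUS = [
    "TikTok search serves billions of requests per day.",
    "Retrieval-augmented generation grounds model answers in retrieved text.",
    "Chain-of-thought prompting asks the model to reason step by step.",
]

def retrieve(query: str, corpus: List[str], k: int = 2) -> List[str]:
    """Rank corpus snippets by naive keyword overlap with the query."""
    q_tokens = set(query.lower().split())
    return sorted(corpus, key=lambda doc: -len(q_tokens & set(doc.lower().split())))[:k]

def build_prompt(query: str, snippets: List[str]) -> str:
    """Prepend retrieved context so the model can ground its answer."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer step by step:"

def generate(prompt: str) -> str:
    """Placeholder for an LLM call; a real system would query a served model."""
    return f"[model output for prompt of {len(prompt)} chars]"

if __name__ == "__main__":
    query = "How does RAG improve search answers?"
    print(generate(build_prompt(query, retrieve(query, CORPUS))))

The point of the sketch is the shape of the pipeline (retrieve, augment the prompt, then generate), not the scoring function, which a production system would replace with a learned retriever.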

Responsibilities

  • Conduct research and develop state-of-the-art algorithms across the stages of LLM development, including continued pretraining, SFT, and RLHF.
  • Investigate and implement robust evaluation methodologies to assess model performance at each stage, uncover the underlying mechanisms and sources of model abilities, and use that understanding to drive model improvements.
  • Use inference-time techniques such as RAG, CoT, and prompt engineering to improve model output.
  • Improve the performance of AI Search in the TikTok app to provide a better search experience for users.
  • Explore and develop large-scale language models and aggressively optimize enterprise applications.
  • Construct data and carry out instruction tuning, preference alignment, and model optimization.
  • Implement relevant applications, including content generation, summarization, etc.

Other

  • Effective communication and teamwork skills.
  • Candidates with papers at top-tier conferences (e.g., ICML, NeurIPS, ICLR, CVPR, ICRA, KDD), relevant internship experience, or awards in ACM competitions are preferred.