Student R&D support
by Admin on 01-13-2026 at 6:41 pm

Your mission

We are looking for a motivated student assistant to support our R&D team in setting up and maintaining large language model (LLM) inference environments and related API services. The role involves hands-on work with modern inference frameworks and GPU-based infrastructures, both cloud-hosted and on-premises.

  • Setting up, configuring, and maintaining LLM inference frameworks such as vLLM, TensorRT-LLM, llama.cpp, Ollama, and SGLang.
  • Deploying and managing API endpoints for model inference on self-hosted GPU servers and cloud GPU instances (e.g., RunPod, Hetzner, AWS).
  • Performing DevOps-related activities such as container setup, port forwarding, reverse proxy configuration, and HTTPS endpoint deployment.
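
For illustration, a routine task in this area is checking that a freshly deployed endpoint answers requests correctly. The sketch below is only an example of that kind of work: it assumes an OpenAI-compatible API such as the one vLLM's built-in server exposes, and the endpoint URL, model name, and API key are placeholders that depend on the actual deployment.

    # Minimal smoke test against a self-hosted, OpenAI-compatible inference
    # endpoint (e.g., a vLLM server). The URL, model name, and API key are
    # placeholders and must match the actual deployment.
    import json
    import urllib.request

    ENDPOINT = "http://localhost:8000/v1/chat/completions"  # placeholder URL
    API_KEY = "changeme"                                     # placeholder key

    payload = {
        "model": "my-model",  # placeholder; must match the served model name
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "max_tokens": 32,
    }

    request = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

    # Print the first completion returned by the server (OpenAI response schema).
    with urllib.request.urlopen(request, timeout=30) as response:
        body = json.loads(response.read().decode("utf-8"))
        print(body["choices"][0]["message"]["content"])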

Your profile

  • Enrolled student in Computer Science, Electrical Engineering, Data Science, or a related field.
  • Solid knowledge of Linux environments and shell scripting.
  • Experience with Docker, Python, and basic networking and SSH concepts (e.g., ports, reverse proxies, secure connections).
  • Experience with local LLM serving frameworks such as llama.cpp, vLLM, Ollama, or TensorRT-LLM, as well as familiarity with GPU-based computation (CUDA, driver management, and hardware resource monitoring), would be a strong plus.

About us

LUBIS is a fast-growing German startup redefining how the semiconductor industry works. We tackle one of its hardest challenges: ensuring complex chips work flawlessly before they’re built.

Our mission is simple: to transform verification from a craft into a system. By structuring how teams work and applying automation, we make chip design faster, more reliable, and bug-free.

LUBIS isn’t just improving the process — we’re defining how verification is done.

Apply for job

To view the job application, please visit lubis-eda.jobs.personio.de.