
Fall 2025 CRCD Workshops

All workshops will be presented virtually.

CRCD Ecosystem On-Ramp
Thursday, Sept. 11, 1-4pm ET

This getting-started session introduces users to CRCD's specialized compute resources and data storage systems. Topics covered include how to access the computing clusters, how to load software, how to schedule jobs with SLURM, and strategies for using the resources effectively.
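As a preview of the SLURM scheduling topic, a job is typically described in a short batch script and submitted with sbatch. The sketch below is illustrative only; the partition, module, and script names are placeholders, and actual values depend on the cluster (the workshop and CRCD documentation cover the real ones):

```shell
#!/bin/bash
#SBATCH --job-name=example        # name shown in the queue
#SBATCH --nodes=1                 # run on a single node
#SBATCH --ntasks=1                # one task
#SBATCH --cpus-per-task=4         # four CPU cores for that task
#SBATCH --time=01:00:00           # wall-time limit (HH:MM:SS)
#SBATCH --partition=smp           # placeholder partition name

# Load software through the module system, then run the job
module load python/3.11           # placeholder module name
python my_analysis.py             # placeholder script

# Submit with:       sbatch job.sh
# Check status with: squeue -u $USER
```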

Highly recommended for new users, and a useful refresher for more experienced ones.

Prerequisite: a CRCD user account
To register: https://pitt.co1.qualtrics.com/jfe/form/SV_b8IRE7KdQqK5GN8


Ab initio simulation of molecules and complex materials
Oct. 9, 1-4pm ET

This workshop will provide an overview of the most widely used approaches for modeling the electronic structure of molecules and condensed phases, including quantum-chemistry methods (Hartree–Fock and post-Hartree–Fock theories) and density-functional theory. We will discuss the challenges of modeling infinite periodic systems and describe the principles, advantages, and limitations of different computational techniques for treating electrons in solids. We will also briefly cover the extension of density-functional theory to simulations of atomic motion at finite temperatures using ab initio molecular dynamics (Car–Parrinello and Born–Oppenheimer), as well as a selection of emerging topics, such as the application of machine learning in electronic structure theory and the advent of quantum computing.

To register: https://pitt.co1.qualtrics.com/jfe/form/SV_bfTnAHqfPKaTgbQ


Introduction to Quantum Circuits and Algorithms with CUDA-Q
Oct. 23, 1-4pm ET

This workshop offers a practical introduction to the core concepts of quantum computing and quantum algorithms using NVIDIA's CUDA-Q library. Aimed at interested beginners from all disciplines, the workshop requires only a solid undergraduate-level understanding of mathematics and programming (particularly linear algebra and intermediate-level Python) and does not assume in-depth knowledge of quantum mechanics. While all the small-scale circuits and algorithms in this workshop will be simulated on GPU clusters, participants will get a hands-on introduction to some of the most promising applications of quantum algorithms, such as quantum teleportation, Grover's search, and Shor's factorization. A brief outlook on additional examples, such as the quantum approximate optimization algorithm (QAOA), will also be given.
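As a taste of the linear-algebra background the workshop assumes, here is a plain-NumPy sketch (not CUDA-Q itself) of the two-qubit entangling circuit that underlies quantum teleportation: gates are unitary matrices, and a circuit is a sequence of matrix products applied to a state vector.

```python
import numpy as np

# Single-qubit Hadamard gate as a 2x2 unitary matrix
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# Two-qubit CNOT gate (control = qubit 0, target = qubit 1)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to qubit 0, then CNOT
state = np.array([1, 0, 0, 0], dtype=complex)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Measurement probabilities: the qubits are now entangled,
# with all weight on |00> and |11> (a Bell state)
probs = np.abs(state) ** 2
print(probs)  # → [0.5 0.  0.  0.5]
```

In CUDA-Q the same circuit is expressed as a kernel of gate calls rather than explicit matrices, and the library handles the simulation on GPUs.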

Prerequisites: None required; basic familiarity with quantum mechanics is an advantage.
To register: https://pitt.co1.qualtrics.com/jfe/form/SV_bfTnAHqfPKaTgbQ


CRCD Hosted NVIDIA Workshops

Adding New Knowledge to LLMs
Nov. 12, 2-4pm ET
Maximum Number of Attendees: 40

Large Language Models (LLMs) are powerful, but their knowledge is often general-purpose and may lack the specific, up-to-date, or specialized information required for enterprise applications. The "Adding New Knowledge to LLMs" workshop provides a comprehensive, hands-on guide to the essential techniques for augmenting and customizing LLMs.

This workshop takes you on a complete journey from raw data to a fine-tuned, optimized model. You will begin by learning how to curate high-quality datasets and generate synthetic data with NVIDIA NeMo Curator. Next, you will dive deep into the crucial process of model evaluation, using benchmarks, LLM-as-a-judge, and the NeMo Evaluator to rigorously assess model performance. With a solid foundation in evaluation, you will then explore a suite of powerful customization techniques, including Continued Pretraining to inject new knowledge, Supervised Fine-Tuning to teach new skills, and Direct Preference Optimization (DPO) to align model behavior with human preferences.

Finally, you will learn to make your customized models efficient for real-world deployment by exploring essential optimization techniques like quantization, pruning, and knowledge distillation using TensorRT-LLM and the NeMo framework. The workshop culminates in a hands-on assessment where you will apply your new skills to align an LLM to a specific conversational style, solidifying your ability to tailor models for any application.

Register at: https://pitt.co1.qualtrics.com/jfe/form/SV_cZVnNvb3ZM7eX7E


Domain-Adaptive Pre-Training: Tailoring LLMs for Specialized Applications
Dec. 3, 2-4pm ET
Maximum Number of Attendees: 40

While Large Language Models (LLMs) are broadly capable, their general knowledge often falls short of the specialized, domain-specific information required for enterprise applications. This hands-on lab provides a focused, end-to-end approach to building domain-specific LLMs: you'll learn how to curate domain-specific datasets, design and train custom tokenizers, and execute the pre-training process to tailor models for specialized applications. The course takes you on a practical journey from initial data preparation to a domain-adapted, fine-tuned model, giving you the practical skills needed to adapt LLMs to your unique domain requirements and real-world use cases.

Register at: https://pitt.co1.qualtrics.com/jfe/form/SV_cZVnNvb3ZM7eX7E