Accelerating Computation through a marriage of
Computer Science & Computational Science

Projects (MEng/UROP)

If you are interested in any of these projects and are a current MIT student looking for a UROP or MEng, please reach out to the mentor listed next to the project.

Methods in Scientific Machine Learning

A large list of projects in scientific machine learning can be found here. Take that list as a set of ideas from which larger projects can be chosen.

Julia Compiler/Runtime for HPC

Mentor: Valentin Churavy

We have many projects on compilers and runtimes in the context of scientific computing; the topics below can serve as inspiration.

Accelerated computing

KernelAbstractions.jl

KernelAbstractions.jl provides a common interface for writing GPU kernels in Julia and executing them on multiple platforms.
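To give a sense of the programming model, here is a minimal sketch of a SAXPY kernel written with KernelAbstractions.jl. The backend choice (CPU() below) is illustrative; the same kernel can be instantiated for CUDA, AMD, or Metal backends by passing the corresponding backend object and array types.

```julia
using KernelAbstractions

# y .= a .* x .+ y, written once as a portable kernel.
@kernel function saxpy_kernel!(y, a, @Const(x))
    i = @index(Global)
    @inbounds y[i] = a * x[i] + y[i]
end

# Illustrative launch on the CPU backend; swapping in a GPU backend
# (with matching device arrays) runs the same kernel on an accelerator.
backend = CPU()
x = rand(Float32, 1024)
y = rand(Float32, 1024)
saxpy_kernel!(backend)(y, 2.0f0, x; ndrange = length(y))
KernelAbstractions.synchronize(backend)
```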

AMDGPU.jl

Mentor: Julian Samaroo

Compiler-based automatic differentiation – Enzyme.jl

Enzyme.jl is the Julia frontend to the Enzyme automatic-differentiation engine.
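As a quick illustration of the user-facing API, a minimal reverse-mode call might look like the following sketch (the function f is just an example):

```julia
using Enzyme

f(x) = x^2 + 3x

# Reverse-mode differentiation of a scalar function at x = 2.0.
# Active marks the argument (and the return value) as differentiable.
# The result contains df/dx = 2x + 3 = 7.0 at x = 2.
autodiff(Reverse, f, Active, Active(2.0))
```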

General Julia compiler infrastructure

CESMIX

Accelerate learning by automatically reducing the size of the training dataset.

Feasibility study on reducing the size of an a-HfO2 dataset using a parallel method based on HDBSCAN and ACE. A parallel Julia implementation of a state-of-the-art method will be required, as well as a proposal for an improved version aligned with CESMIX objectives. Description here. Contact: Emmanuel Lujan (eljn AT mit DOT edu)