Julia is a programming language created by Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah in 2009. Since its public release in 2012, Julia has received code contributions from hundreds of scientists, programmers, and engineers across the world.
Today, a small group of researchers based at MIT focuses on theoretical and numerical aspects of the core Julia language, the base library, and several other packages. Current activities center on data science applications, numerical linear algebra, parallel computing, and type theory.
The Julia Lab specializes in collaborating with other groups to solve messy real-world computational problems.
Genomics
Existing bioinformatics tools cannot keep pace with the exabytes of data produced by modern genomics research each year, and general-purpose linear algebra libraries are not optimized to exploit this data's inherent structure. To address this problem, the Julia Lab is developing specialized algorithms for principal component analysis and statistical fitting that will enable genomics researchers to analyze data at the same rapid pace at which it is produced.
This project is an exciting interdisciplinary collaboration with Dr. Stavros Papadopoulos (Senior Research Scientist at Intel Labs) and Prof. Nikolaos Patsopoulos (Assistant Professor at Brigham and Women's Hospital, the Broad Institute and Harvard Medical School).
Financial Fraud Detection
A single stock exchange generates high-frequency trading (HFT) data at a rate of ~2.2 terabytes per month. Automatic identification of suspicious financial transactions in these high-throughput HFT data streams is an active area of research. The Julia Lab contributes to the battle against financial fraud by designing out-of-core analytics for anomaly detection.
Medical Data Analytics
Hospitals, like many large organizations, collect far more data than human experts can usefully process and analyze with today's available software. As a result, the small-scale analyses that are feasible can overlook statistical clues that might have yielded substantial improvements to patient care.
In collaboration with Harvard Medical School, the Julia Lab has worked on tools for rapidly identifying potential indicators of irregularities in medical data, equipping doctors and healthcare providers with the analytics they need to make informed medical decisions.
Numerical Linear Algebra and Parallel Computing
The Julia Lab leads the JuliaParallel organization, which maintains the following projects:
- DistributedArrays.jl: a native Julia distributed array implementation
- MPI.jl: a wrapper for Message Passing Interface (MPI)
- ClusterManagers.jl: Julia support for different job queue systems commonly used on compute clusters
- Dagger.jl: a Dask-like framework for out-of-core and parallel computation
- Elemental.jl: a wrapper for Elemental, a distributed linear algebra/optimization library developed by Prof. Jack Poulson
- ScaLAPACK.jl: a wrapper for the Scalable Linear Algebra Package
- HDFS.jl: a wrapper for the Hadoop Distributed FileSystem
- Elly.jl: a client for Apache YARN
The Julia Lab also collaborates with Prof. Steven G. Johnson and Jared Crean in the development of PETSc.jl, a wrapper for the Portable, Extensible Toolkit for Scientific Computation.
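As a brief illustration of the distributed-array tooling listed above, here is a minimal sketch of using DistributedArrays.jl to spread an array across local worker processes and reduce over it. This is an assumed usage pattern, not code from the lab; the package's API may differ between versions.

```julia
# Hedged sketch of DistributedArrays.jl usage (assumed API; may vary by version).
using Distributed
addprocs(2)                          # start two local worker processes
@everywhere using DistributedArrays  # load the package on every worker

a = distribute(collect(1.0:1000.0))  # split a local array across the workers
s = sum(a)                           # reduction runs in parallel over the chunks
println(s)
```

Each worker holds only its own chunk of `a`, so the same pattern scales to arrays too large for a single machine's memory.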
Publications
- Portable and Productive High-Performance Computing. Ekanathan Palamadai Natarajan (2017). Ph.D. dissertation, Massachusetts Institute of Technology.
- Forward-Mode Automatic Differentiation in Julia. Jarrett Revels, Miles Lubin and Theodore Papamarkou (2016), AD2016 - 7th International Conference on Algorithmic Differentiation.
- Robust benchmarking in noisy environments. Jiahao Chen, Jarrett Revels, Alan Edelman (2016), Proceedings of the 20th Annual IEEE High Performance Extreme Computing Conference, https://arxiv.org/abs/1608.04295.
- The right way to search evolving graphs. Jiahao Chen, Weijian Zhang (2016), Proceedings of GABB’2016 - Graph Algorithms Building Blocks Workshop, http://arxiv.org/abs/1601.08189.
- Parallel prefix polymorphism permits parallelization, presentation & proof. Jiahao Chen, Alan Edelman (2014), HPTCDL’14 Proceedings of the 1st Workshop on High Performance Technical Computing in Dynamic Languages, 47–56, doi:10.1109/HPTCDL.2014.9, http://jiahao.github.io/parallel-prefix.
- Array operators using multiple dispatch: A design methodology for array implementations in dynamic languages. Jeff Bezanson, Jiahao Chen, Stefan Karpinski, Viral Shah, Alan Edelman (2014), ARRAY’14 Proceedings of the ACM SIGPLAN International Workshop on Libraries, Languages, and Compilers for Array Programming, 56–61, doi:10.1145/2627373.2627383, http://arxiv.org/abs/1407.3845.
Talks
- Jarrett Revels, "Automatic Differentiation in the Julia Language." SIAM Manchester Julia Workshop 2016.
- Jarrett Revels, "Automatic Differentiation in the Julia Language." AD2016 - 7th International Conference on Algorithmic Differentiation.
- Jarrett Revels, "ForwardDiff.jl: Fast Derivatives Made Easy." JuliaCon 2016.
Current Members
- Prof. Alan Edelman (Principal Investigator)
- Dr. Jiahao Chen (Research Scientist)
- Dr. Andreas Noack (Postdoctoral Associate)
- Jarrett Revels (Research Engineer)
- Prof. David P. Sanders (Visiting Professor)
- Tim Besard (Visiting PhD Student)
- Adelaide (Addy) Chambers (Undergraduate)
- Luana Lara (Undergraduate)
- Lin Pease (Undergraduate)
- Kelly Shen (Undergraduate)
- Cooper Sloan (Undergraduate)
- Mark Wang (Undergraduate)
Alumni
- Prof. Ivan Slapničar (Visiting Professor, Fall 2014)
- Dr. Jeff Bezanson (PhD Student, Fall 2009-Spring 2015)
- Dr. Matt Bauman (Visiting PhD Student, Fall 2015)
- Stefan Karpinski (Data Scientist, Summer 2013-Fall 2014)
- Jameson Nash (Undergraduate Student, Summer 2013)
- Weijian Zhang (Visiting Student, Spring 2016)
- Jake Bolewski (Research Engineer, Summer 2014-Spring 2016)
- Oscar Blumberg (Visiting Masters Student, Spring-Summer 2015)
- Keno Fischer (Undergraduate Student, Summer 2013 and Summer 2014)
- Runpeng Liu (Undergraduate Student, Fall 2015)
- Dr. Xianyi Zhang (Postdoctoral Associate, 2016)
- Dr. Eka Palamadai (PhD Student, Fall 2011-Spring 2016)
- Valentin Churavy (Visiting PhD Student, Summer-Fall 2016)
- David A. Gold (Visiting PhD Student, Summer 2016)
- Jacob Higgins (Undergraduate, Summer 2016)
The Julia group is grateful for numerous collaborations at MIT and around the world:
- Prof. Steven G. Johnson (Mathematics)
- Prof. Juan Pablo Vielma
- Dr. Jeremy Kepner (MIT Lincoln Laboratory)
- Dr. Homer Reid (Mathematics)
- Dr. Alex Townsend (Mathematics)
- Dr. Jean Yang (CSAIL alum, Harvard Medical School)
- The JuliaOpt team at the Operations Research Center
- Joey Huchette
- Yee Sian Ng
- Miles Lubin
- Simon Kornblith (Brain and Cognitive Sciences)
- Jon Malmaud (Brain and Cognitive Sciences)
- Spencer Russell (Media Lab)
- Chiyuan Zhang (CSAIL), author of the popular Mocha.jl framework for deep learning
- Zenna Tavares (CSAIL) author of the Sigma.jl probabilistic programming environment
- Jared Crean (Rensselaer Polytechnic Institute)
We thank DARPA XDATA, the Intel Science and Technology Center for Big Data, Saudi Aramco, the MIT Institute for Soldier Nanotechnologies, and NIH BD2K for their generous financial support.
The Julia Lab is a member of the bigdata@CSAIL MIT Big Data Initiative and gratefully acknowledges sponsorship from the MIT EECS SuperUROP Program and the MIT UROP Office for our talented undergraduate researchers.