Julia is a programming language created by Jeff Bezanson, Alan Edelman, Stefan Karpinski, and Viral B. Shah in 2009. Since its public release in 2012, Julia has received code contributions from hundreds of scientists, programmers, and engineers across the world.
Today, the Julia Lab collaborates with a variety of researchers on real-world problems and applications, while simultaneously working on the core language and its ecosystem.
- Cataloging the Visible Universe through Bayesian Inference at Petascale. Jeffrey Regier, Kiran Pamnany, Keno Fischer, Andreas Noack, Maximilian Lam, Jarrett Revels, Steve Howard, Ryan Giordano, David Schlegel, Jon McAuliffe, Rollin Thomas, Prabhat (2018), International Parallel and Distributed Processing Symposium (IPDPS), https://arxiv.org/pdf/1801.10277.pdf
- On Machine Learning and Programming Languages. Mike Innes, Stefan Karpinski, Viral Shah, David Barber, Pontus Stenetorp, Tim Besard, James Bradbury, Valentin Churavy, Simon Danisch, Alan Edelman, Jon Malmaud, Jarrett Revels, Deniz Yuret (2018), In Proceedings of SysML conference (SysML 2018), https://www.sysml.cc/doc/37.pdf
- Julia: A Fresh Approach to Numerical Computing. Jeff Bezanson, Alan Edelman, Stefan Karpinski, Viral Shah (2017), SIAM Review, 59: 65–98, http://julialang.org/publications/julia-fresh-approach-BEKS.pdf
- Portable and Productive High-Performance Computing. Ekanathan Palamadai Natarajan (2017). Ph.D. dissertation, Massachusetts Institute of Technology.
- Forward-Mode Automatic Differentiation in Julia. Jarrett Revels, Miles Lubin, Theodore Papamarkou (2016), AD2016 - 7th International Conference on Algorithmic Differentiation.
- Robust benchmarking in noisy environments. Jiahao Chen, Jarrett Revels, Alan Edelman (2016), Proceedings of the 20th Annual IEEE High Performance Extreme Computing Conference, https://arxiv.org/abs/1608.04295.
- The right way to search evolving graphs. Jiahao Chen, Weijian Zhang (2016), Proceedings of GABB’2016 - Graph Algorithms Building Blocks Workshop, http://arxiv.org/abs/1601.08189.
- Abstraction in Technical Computing. Jeffrey Werner Bezanson (2015). Ph.D. dissertation, Massachusetts Institute of Technology.
- Parallel prefix polymorphism permits parallelization, presentation & proof. Jiahao Chen, Alan Edelman (2014), HPTCDL’14 Proceedings of the 1st Workshop on High Performance Technical Computing in Dynamic Languages, 47–56, doi:10.1109/HPTCDL.2014.9, http://jiahao.github.io/parallel-prefix.
- Array operators using multiple dispatch: A design methodology for array implementations in dynamic languages. Jeff Bezanson, Jiahao Chen, Stefan Karpinski, Viral Shah, Alan Edelman (2014), ARRAY’14 Proceedings of the ACM SIGPLAN International Workshop on Libraries, Languages, and Compilers for Array Programming, 56–61, doi:10.1145/2627373.2627383, http://arxiv.org/abs/1407.3845.
- Jarrett Revels, "Automatic Differentiation in the Julia Language." SIAM Manchester Julia Workshop 2016.
- Jarrett Revels, "Automatic Differentiation in the Julia Language." AD2016 - 7th International Conference on Algorithmic Differentiation.
- Jarrett Revels, "ForwardDiff.jl: Fast Derivatives Made Easy." JuliaCon 2016.
The Julia Lab specializes in collaborating with other groups to solve messy real-world computational problems.
Existing bioinformatics tools are not performant enough to handle the exabytes of data produced each year by modern genomics research, and general-purpose linear algebra libraries are not optimized to take advantage of this data's inherent structure. To address this problem, the Julia Lab is developing specialized algorithms for principal component analysis and statistical fitting that will enable genomics researchers to analyze data as rapidly as it is produced.
This project is an exciting interdisciplinary collaboration with Dr. Stavros Papadopoulos (Senior Research Scientist at Intel Labs) and Prof. Nikolaos Patsopoulos (Assistant Professor at Brigham and Women's Hospital, the Broad Institute and Harvard Medical School).
Financial Fraud Detection
A single stock exchange generates high-frequency trading (HFT) data at a rate of ~2.2 terabytes per month. Automatically identifying suspicious financial transactions in these high-throughput data streams is an active area of research. The Julia Lab contributes to the battle against financial fraud by designing out-of-core analytics for anomaly detection.
Medical Data Analytics
Hospitals, like many large organizations, collect far more data than human experts can usefully process and analyze with today's available software. Analyses limited to small subsets of this data can overlook statistical clues that might otherwise yield substantial improvements to patient care.
In collaboration with Harvard Medical School, the Julia Lab has worked on tools for rapidly identifying potential indicators of irregularities in medical data, equipping doctors and healthcare providers with the analytics they need to make informed medical decisions.
Numerical Linear Algebra and Parallel Computing
The Julia Lab leads the JuliaParallel organization, which maintains the following projects:
- DistributedArrays.jl: a native Julia distributed array implementation
- MPI.jl: a wrapper for Message Passing Interface (MPI)
- ClusterManagers.jl: Julia support for different job queue systems commonly used on compute clusters
- Dagger.jl: a Dask-like framework for out-of-core and parallel computation
- Elemental.jl: a wrapper for Elemental, a distributed linear algebra/optimization library developed by Prof. Jack Poulson
- ScaLAPACK.jl: a wrapper for the Scalable Linear Algebra Package
- HDFS.jl: a wrapper for the Hadoop Distributed File System
- Elly.jl: a client for Apache YARN
The Julia Lab also collaborates with Prof. Steven G. Johnson and Jared Crean in the development of PETSc.jl, a wrapper for the Portable, Extensible Toolkit for Scientific Computation.
- Prof. Alan Edelman (Principal Investigator)
- Jarrett Revels (Research Engineer)
- Prof. David P. Sanders (Visiting Professor)
- Peter Ahrens (PhD Student)
- Sung Woo Jeong (PhD Student)
- Valentin Churavy (PhD Student)
- Cooper Sloan (M.Eng.)
- Jerry Lingjie Mei (Undergraduate)
- Adelaide (Addy) Chambers (Undergraduate)
- Luana Lara (Undergraduate)
- Lin Pease (Undergraduate)
- Kelly Shen (Undergraduate)
- Mark Wang (Undergraduate)
- Prof. Ivan Slapničar (Visiting Professor, Fall 2014)
- Dr. Jeff Bezanson (PhD Student, Fall 2009-Spring 2015)
- Dr. Matt Bauman (Visiting PhD Student, Fall 2015)
- Stefan Karpinski (Data Scientist, Summer 2013-Fall 2014)
- Jameson Nash (Undergraduate Student, Summer 2013)
- Weijian Zhang (Visiting Student, Spring 2016)
- Jake Bolewski (Research Engineer, Summer 2014-Spring 2016)
- Oscar Blumberg (Visiting Masters Student, Spring-Summer 2015)
- Keno Fischer (Undergraduate Student, Summer 2013 and Summer 2014)
- Runpeng Liu (Undergraduate Student, Fall 2015)
- Dr. Xianyi Zhang (Postdoctoral Associate, 2016)
- Dr. Eka Palamadai (PhD Student, Fall 2011-Spring 2016)
- Valentin Churavy (Visiting PhD Student, Summer-Fall 2016)
- Tim Besard (Visiting PhD Student, Summer-Fall 2016)
- David A. Gold (Visiting PhD Student, Summer 2016)
- Jacob Higgins (Undergraduate, Summer 2016)
- Dr. Andreas Noack (Postdoctoral Associate)
- Dr. Jiahao Chen (Research Scientist, Fall 2013-Spring 2017)
The Julia group is grateful for numerous collaborations at MIT and around the world:
- Prof. Steven G. Johnson (MIT Mathematics)
- Prof. Juan Pablo Vielma
- Dr. Jeremy Kepner (MIT Lincoln Laboratory)
- Dr. Homer Reid (MIT Mathematics)
- Dr. Alex Townsend (MIT Mathematics)
- Dr. Jean Yang (MIT CSAIL alum, Harvard Medical School)
- The JuliaOpt team at the Operations Research Center
- Joey Huchette
- Yee Sian Ng
- Miles Lubin
- Simon Kornblith (MIT Brain and Cognitive Sciences)
- Jon Malmaud (MIT Brain and Cognitive Sciences)
- Spencer Russell (MIT Media Lab)
- Chiyuan Zhang (MIT CSAIL), author of the popular Mocha.jl framework for deep learning
- Zenna Tavares (MIT CSAIL), author of the Sigma.jl probabilistic programming environment
- Jared Crean (Rensselaer Polytechnic Institute)
We thank DARPA XDATA, the Intel Science and Technology Center for Big Data, Saudi Aramco, the MIT Institute for Soldier Nanotechnologies, and NIH BD2K for their generous financial support.
The Julia Lab is a member of the bigdata@CSAIL MIT Big Data Initiative and gratefully acknowledges sponsorship from the MIT EECS SuperUROP Program and the MIT UROP Office for our talented undergraduate researchers.