SANDS lab

We develop techniques and algorithms for building and managing key networked systems that are worthy of society’s trust. Our core interests lie in improving the modern computing environment, in which distributed systems and computer networks are pervasive.

  • We focus on bridging the gap between the abstractions that users (e.g., software developers, cloud providers, or network operators) need and what a performant, scalable, dependable, and deployable system can achieve in practice.
  • We build prototypes that directly improve the lives of real users.
  • We seek solutions based on theoretically grounded arguments while also gaining insights into constraints and trade-offs in the design space.

Our goal is to enrich human knowledge of how to build future-proof systems that can stand the test of time.

News

  • Aug'23: Chen-Yu (Elton) has defended his PhD thesis titled “Tackling the Communication Bottlenecks of Distributed Deep Learning Training Workloads” and will next join Bytedance (USA). Congratulations!
  • Mar'23: Arnaud has defended his PhD thesis titled “Verification and Privacy Techniques for Improving the Trustworthiness of Neural Networks” and will next join Nokia Bell Labs. Congratulations!
  • LineFS wins the Best Paper Award at SOSP'21.
  • “Rethinking gradient sparsification as total error minimization” accepted as a spotlight paper (top 3%) at NeurIPS'21.
  • We organize a tutorial on Network-Accelerated Distributed Deep Learning at SIGCOMM'21.
  • In our NSDI'21 paper, we demonstrated how to accelerate distributed ML via in-network aggregation with SwitchML. In our upcoming SIGCOMM'21 paper introducing OmniReduce, we advance streaming aggregation by leveraging the sparsity of large models’ gradient vectors to accelerate training (a toy sketch of the idea follows this list).
  • In the GRACE project, we survey popular gradient compression techniques for distributed deep learning and perform a comprehensive comparative evaluation. Read our ICDCS'21 paper.
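
A toy sketch of the sparse aggregation idea behind OmniReduce, in plain NumPy: each worker contributes only the nonzero blocks of its gradient, and the aggregator sums blocks wherever any worker contributed, leaving the rest implicitly zero. Function names and the block size are illustrative, and the real system streams blocks across the network rather than in-process:

```python
import numpy as np

def sparse_allreduce(worker_grads, block_size=4):
    """Sum sparse gradients by exchanging only nonzero blocks.

    Each worker sends (block_index, values) pairs for its nonzero
    blocks; blocks that no worker touches are never transmitted.
    """
    dim = worker_grads[0].size
    aggregated = np.zeros(dim)
    for grad in worker_grads:
        blocks = grad.reshape(-1, block_size)
        for idx in np.flatnonzero(np.any(blocks != 0, axis=1)):
            # Only these nonzero blocks travel over the network.
            aggregated[idx * block_size:(idx + 1) * block_size] += blocks[idx]
    return aggregated

# Two workers with mostly-zero (sparse) gradients.
g1 = np.zeros(16); g1[0:4] = 1.0
g2 = np.zeros(16); g2[0:4] = 2.0; g2[8:12] = 5.0
print(sparse_allreduce([g1, g2]))  # blocks 1 and 3 stay zero, untransmitted
```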

Contact

Projects

CoLExT

Collaborative Learning Experimentation Testbed

SIDCo

An Efficient Statistical-Based Gradient Compression Technique for Distributed Training Systems
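
The gist: exact top-k selection requires sorting, which is costly on large gradients; a statistical approach instead fits a distribution to the gradient magnitudes and derives a threshold that keeps roughly the target fraction in O(n). A minimal sketch assuming a single-stage exponential fit (the actual SIDCo algorithm is multi-stage and considers several distributions; all names here are illustrative):

```python
import numpy as np

def estimate_threshold(grad, ratio=0.01):
    """Threshold keeping ~`ratio` of elements, assuming |grad| ~ Exp(lam).

    For Exp(lam): P(|g| > t) = exp(-lam * t), with lam ~= 1 / mean(|g|),
    so t = -mean(|g|) * ln(ratio).
    """
    return -np.abs(grad).mean() * np.log(ratio)

def compress(grad, ratio=0.01):
    t = estimate_threshold(grad, ratio)
    idx = np.flatnonzero(np.abs(grad) > t)  # O(n), no sort needed
    return idx, grad[idx]

rng = np.random.default_rng(0)
g = rng.laplace(0.0, 1.0, 1_000_000).astype(np.float32)  # |g| ~ Exp(1)
idx, vals = compress(g, ratio=0.01)
print(f"kept {idx.size / g.size:.4f} of elements")  # ~0.0100
```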

OmniReduce

Efficient Sparse Collective Communication

GRACE

GRAdient ComprEssion for distributed deep learning
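
For flavor, one classic compressor covered by the survey is top-k sparsification, written here in a compress/decompress style; this is an illustrative sketch, not GRACE’s actual API:

```python
import numpy as np

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries of the gradient."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def topk_decompress(idx, vals, dim):
    """Scatter the kept values back into a dense zero vector."""
    out = np.zeros(dim, dtype=vals.dtype)
    out[idx] = vals
    return out

g = np.random.default_rng(1).normal(size=1000).astype(np.float32)
idx, vals = topk_compress(g, k=10)             # transmit ~1% of the data
restored = topk_decompress(idx, vals, g.size)  # lossy dense reconstruction
```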

FairFL

A Systems Approach to Tackling Fairness in Federated Learning

DC2

Delay-aware Communication Control for Distributed ML
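
A toy control loop, under the assumption that DC2 tunes how aggressively gradients are compressed based on the communication delay it observes; the proportional rule and all names are illustrative, not the paper’s actual controller:

```python
def adjust_ratio(ratio, measured_delay, target_delay,
                 min_ratio=0.001, max_ratio=1.0):
    """Compress harder when communication is slower than the target,
    relax (send more) when there is slack."""
    ratio *= target_delay / measured_delay
    return max(min_ratio, min(max_ratio, ratio))

ratio = 0.1  # fraction of gradient entries kept
for delay in [0.05, 0.2, 0.4, 0.1]:  # seconds per round (made up)
    ratio = adjust_ratio(ratio, delay, target_delay=0.1)
    print(f"delay={delay:.2f}s -> keep ratio={ratio:.4f}")
```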

SwitchML

Scaling Distributed Machine Learning with In-Network Aggregation
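
A host-side emulation of the core trick: programmable switches only do integer arithmetic, so workers quantize gradient chunks to fixed-point, the switch sums one packet-sized integer chunk per worker, and workers scale the result back. The scale factor and all names are illustrative:

```python
import numpy as np

SCALE = 2 ** 16  # fixed-point scaling factor (illustrative)

def to_fixed(chunk):
    """Workers convert float gradients to integers before sending."""
    return np.round(chunk * SCALE).astype(np.int64)

def switch_aggregate(packets):
    """The switch's role, emulated: element-wise integer sum of one
    chunk from every worker."""
    return np.sum(packets, axis=0)

def from_fixed(total):
    """Workers scale the aggregated integers back to floats."""
    return total.astype(np.float64) / SCALE

chunks = [to_fixed(np.random.default_rng(i).normal(size=64))
          for i in range(4)]                     # 4 workers, one chunk each
print(from_fixed(switch_aggregate(chunks))[:4])  # aggregated gradient
```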

DAIET

In-Network Computation is a Dumb Idea Whose Time Has Come

Our previous major projects focused on SDN and programmable networks.

Group

Faculty

Marco Canini

Associate Professor of Computer Science

Distributed Systems, Networking, Machine Learning, Cloud Computing

Research Staff

Amandio Faustino

Research Software Engineer

Mubarak Ojewale

Postdoc

Students

Achref Rebai

MS/PhD Student

Boris Radovic

PhD Student

Jihao Xin

PhD Student

Juyi Lin

MS/PhD Student

Mohammed K. Aljahdali

PhD Student

Norah Alballa

PhD Student

Salma Kharrat

PhD Student

Tongzhou Gu

MS/PhD Student

Vladyslav Shumanskyy

PhD Student

Alumni

Ahmed M. Abdelmoniem Sayed

Alumni

Postdoc 2019, Research Scientist 2020-2021, now Assistant Professor at QMUL

Amedeo Sapio

Alumni

Postdoc 2018-19, now Software Engineer at Intel

Arnaud Dethise

Alumni

PhD 2023, now Research Scientist at Nokia Bell Labs

Atal Sahu

Alumni

MS 2020, now Data Scientist at Regology

Chen-Yu Ho

Alumni

PhD 2023, joining Bytedance (USA)

Dan Levin

Alumni

PhD 2014, co-founder and CEO of Stacktile GmbH

Fatimah Zohra

Alumni

MS 2020, now PhD Student at KAUST

Hassan Alsibyani

Alumni

MS 2018, now Technical Lead at Wasphi

Jiawei Fei

Alumni

PhD 2021, sponsored by the China Scholarship Council (CSC)

Lalith Suresh

Alumni

PhD 2016, now Researcher at VMware Research

Marco Chiesa

Alumni

Postdoc 2015-2017, now Associate Professor at KTH

M. Bilal

Alumni

PhD 2022, now Senior Engineer at Unbabel

Omar Alama

Alumni

Research Software Engineer 2020-21, now MS student in Computer Engineering at CMU

Omar Zawawi

Alumni

MS 2023, now Software Engineer at Mozn

Thanh Dang Nguyen

Alumni

Postdoc 2015-16, now Research Engineer at University of Chicago

Waleed Reda

Alumni

PhD 2022, now Postdoctoral Researcher at Microsoft Research

Yousef Alowayed

Alumni

MS 2018, now Software Engineer at Google

Open Positions

I’m always looking for bright and enthusiastic people to join my group. If you are looking to do a PhD with me, thank you for your interest, but please read this first. If you don’t, I will know, and I’m afraid I will have to ignore your message.