News
- Amazon Science published an article about my experience teaching at Stanford while working at Amazon.
- I was interviewed by Jay Shah on his podcast about recommendation systems, being an applied scientist, and building a research career.
Research
I am an applied scientist at Amazon. My research interests include large language models, hyperparameter optimization, transfer learning, recommendation engines, scalable machine learning, and online algorithms.
In the past couple of years, I have worked on ranking and recommendation engines for Alexa Video. Prior to that, I was a research scientist at Visa Research, where I worked on the “transaction predictor” and “cause analysis engine” problems, which focused on providing merchants with actionable insights to improve their KPIs. I received my PhD from the University of Utah, where I worked on matrix approximation problems in data streams. My collaborators and I showed that our proposed Frequent Directions algorithm is space-optimal with respect to the error bounds it guarantees. In 2017, I did a short postdoc at DIMACS, Rutgers University, under the supervision of Muthu Muthukrishnan.
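For readers unfamiliar with Frequent Directions: it maintains a small sketch B of a tall matrix A seen one row at a time, such that B^T B approximates A^T A. The sketch below is an illustrative NumPy rendering of the core idea (a doubled buffer that is periodically shrunk via SVD), not the paper's exact pseudocode; variable names and the buffer-doubling choice are my own.

```python
import numpy as np

def frequent_directions(A, ell):
    """Illustrative sketch of the Frequent Directions idea: process rows of
    A one at a time into a 2*ell-row buffer; whenever the buffer fills,
    take its SVD and shrink all squared singular values by the ell-th one,
    which zeroes out the bottom half of the buffer."""
    n, d = A.shape
    B = np.zeros((2 * ell, d))
    filled = 0

    def shrink(B):
        # SVD the buffer and subtract the ell-th squared singular value
        # from every squared singular value (clipping at zero).
        _, s, Vt = np.linalg.svd(B, full_matrices=False)
        delta = s[min(ell, len(s)) - 1] ** 2
        s = np.sqrt(np.maximum(s ** 2 - delta, 0.0))
        out = np.zeros((2 * ell, d))
        out[: len(s)] = s[:, None] * Vt
        return out

    for row in A:
        if filled == 2 * ell:
            B = shrink(B)
            filled = ell  # the bottom half of the buffer is now zero
        B[filled] = row
        filled += 1

    B = shrink(B)   # final shrink leaves at most ell nonzero rows
    return B[:ell]  # ell x d sketch with ||A^T A - B^T B||_2 <= ||A||_F^2 / ell
```

The guarantee stated in the final comment (spectral error bounded by the squared Frobenius norm of A divided by the sketch size) is the deterministic bound the papers below establish; the optimality result referenced above shows no algorithm can achieve it with asymptotically less space.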
Teaching
Mining Massive Datasets (CS246) Stanford University - Winter 2022
Design and Analysis of Data Structures and Algorithms (CS513) Rutgers University - Spring 2018
Randomized and Big Data Algorithms (CS 600.464) Johns Hopkins University - Fall 2016. I was a guest lecturer, teaching matrix decompositions and matrix approximation in streaming settings.
Matrix Sketching Seminar (CS 7931/6961) University of Utah - Fall 2015. Lectures covered different matrix approximation techniques along with their error guarantees.
Past Interns
- Sarkar Snigdha Sarathi Das - Pennsylvania State University - 2022
- Mahsa Shafaei - University of Houston - 2020
- Lan Wang - University of Illinois at Urbana-Champaign - 2019
- Zhe Xu - University of Illinois at Urbana-Champaign - 2018
Publications
Efficient Frequent Directions Algorithm for Sparse Matrices Mina Ghashami, Jeff M. Phillips, Edo Liberty. The 22nd SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2016)
Frequent Directions: Simple and Deterministic Matrix Sketching Mina Ghashami, Edo Liberty, Jeff M. Phillips, David P. Woodruff. SIAM Journal on Computing (SICOMP 2016)
Streaming Kernel Principal Component Analysis Mina Ghashami, Danny Perry, Jeff M. Phillips. The 19th International Conference on Artificial Intelligence and Statistics (AISTATS 2016)
Improved Practical Matrix Sketching with Guarantees Mina Ghashami, Amey Desai, Jeff M. Phillips. Journal version: The IEEE Transactions on Knowledge and Data Engineering (TKDE 2016). Conference version: The 22nd European Symposium on Algorithms (ESA 2014)
Continuous Matrix Approximation on Distributed Data Mina Ghashami, Jeff M. Phillips, Feifei Li. The 40th International Conference on Very Large Data Bases (VLDB 2014)
Relative Errors for Deterministic Low-Rank Matrix Approximations Mina Ghashami, Jeff M. Phillips. The 25th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2014)
DLPR, A Distributed Locality Preserving Dimension Reduction Algorithm Mina Ghashami, Hoda Mashayekhi, Jafar Habibi. The 5th International Conference on Internet and Distributed Computing Systems (IDCS 2012)
Patents
Graph learning-based system with updated vectors with Fei Wang, Mahsa Shafaei, Hao Yang
Computer-implemented method, system and computer program product for group recommendation with Hao Yang, Hossein Hamooni
Academic Work
I have served on academic review committees for many venues, including KDD, DAMI (Springer), SISC, and STACS, and as a PC member for AISTATS, NIPS, and ICML.