The structured data that powers business decisions is more complex than the sequences processed by traditional AI models. Enterprise databases, with their interconnected tables of customers, products, and transactions, form intricate graphs that contain valuable predictive signals. But how can we effectively extract insights from these complex relationships without extensive manual feature engineering?
Graph transformers are revolutionizing this space by treating databases as networks and learning directly from raw data. What if you could build models in hours instead of months while achieving better accuracy? How might this technology change the role of data scientists, allowing them to focus on business impact rather than data preparation? Could this be the missing piece that brings the AI revolution to predictive modeling?
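To make the idea of "treating databases as networks" concrete, here is a minimal sketch of how two linked tables could be represented as a heterogeneous graph using PyG (PyTorch Geometric, the library mentioned below). The table names, feature sizes, and foreign-key edges are illustrative assumptions, not the actual pipeline discussed in the episode.

```python
# Minimal sketch: two linked database tables (customers and orders) represented
# as a heterogeneous graph with PyG. Table names, feature dimensions, and the
# foreign-key edges below are illustrative assumptions only.
import torch
from torch_geometric.data import HeteroData

data = HeteroData()

# Each row of a table becomes a node; its encoded columns become node features.
data["customer"].x = torch.randn(100, 8)   # 100 customers, 8 encoded columns
data["order"].x = torch.randn(500, 4)      # 500 orders, 4 encoded columns

# A foreign key (order.customer_id -> customer.id) becomes a set of edges.
customer_id_of_order = torch.randint(0, 100, (500,))
data["customer", "placed", "order"].edge_index = torch.stack(
    [customer_id_of_order, torch.arange(500)], dim=0
)

# A graph transformer or GNN can then learn predictive signals (e.g. churn,
# lifetime value) directly from this graph, without hand-built features.
print(data)
```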
Jure Leskovec is a Professor of Computer Science at Stanford University, where he is affiliated with the Stanford AI Lab, the Machine Learning Group, and the Center for Research on Foundation Models.
Previously, he served as Chief Scientist at Pinterest and held a research role at the Chan Zuckerberg Biohub. He is also a co-founder of Kumo.AI, a machine learning startup. Leskovec has contributed significantly to the development of Graph Neural Networks and co-authored PyG (PyTorch Geometric), a widely used library for graph machine learning. Research from his lab has supported public health efforts during the COVID-19 pandemic and informed product development at companies including Facebook, Pinterest, Uber, YouTube, and Amazon.
His work has received several recognitions, including the Microsoft Research Faculty Fellowship (2011), the Okawa Research Award (2012), the Alfred P. Sloan Fellowship (2012), the Lagrange Prize (2015), and the ICDM Research Contributions Award (2019). His research spans social networks, machine learning, data mining, and computational biomedicine, with a focus on drug discovery. He has received 12 best paper awards and five 10-year Test of Time awards at leading academic conferences.
In the episode, Richie and Jure explore the need for a foundation model for enterprise data, the limitations of current AI models in predictive tasks, the potential of graph transformers for business data, the transformative impact of relational foundation models on machine learning workflows, and much more.