#71: Alex O'Connor — Transformers, Generative AI, and the Deep Learning Revolution

Released Wednesday, 26th April 2023

Alex O’Connor, researcher and ML manager, on the latest trends in generative AI. Language and image models, prompt engineering, the latent space, fine-tuning, tokenization, textual inversion, adversarial attacks, and more.


Alex O’Connor got his PhD in Computer Science from Trinity College Dublin. He was a postdoctoral researcher and funded investigator at the ADAPT Centre for digital content, first at TCD and later at DCU. In 2017, he joined Pivotus, a fintech startup, as Director of Research. For the past few years, Alex has been Senior Manager for Data Science & Machine Learning at Autodesk, leading a team that delivers machine learning for e-commerce, including personalization and natural language processing.


Favorite quotes

  • “None of these models can read.”
  • “Art in the future may not be good, but it will be prompt.” (via Mastodon)

Books

Papers

Links

Machine learning models

Sites

Concepts

People mentioned

Chapters

  • 00:00 · Introduction
  • 00:40 · Machine learning
  • 02:36 · Spam and scams
  • 15:57 · Adversarial attacks
  • 20:50 · Deep learning revolution
  • 23:06 · Transformers
  • 31:23 · Language models
  • 37:09 · Zero-shot learning
  • 42:16 · Prompt engineering
  • 43:45 · Training costs and hardware
  • 47:56 · Open contributions
  • 51:26 · BERT and Stable Diffusion
  • 54:42 · Tokenization
  • 59:36 · Latent space
  • 01:05:33 · Ethics
  • 01:10:39 · Fine-tuning and pretrained models
  • 01:18:43 · Textual inversion
  • 01:22:46 · Dimensionality reduction
  • 01:25:21 · Mission
  • 01:27:34 · Advice for beginners
  • 01:30:15 · Books and papers
  • 01:34:17 · The lab notebook
  • 01:44:57 · Thanks


I'd love to hear from you.

Submit a question about this or any previous episodes.

Join the Discord community. Meet other curious minds.

If you enjoy the show, would you please consider leaving a short review on Apple Podcasts/iTunes? It takes less than 60 seconds and really helps.

Show notes, transcripts, and past episodes at gettingsimple.com/podcast.

Thanks to Andrea Villalón Paredes for editing this interview.
Songs “Sleep” and “A Loop to Kill For” by Steve Combs, under CC BY 4.0.


Follow Nono

Twitter.com/nonoesp

Instagram.com/nonoesp

Facebook.com/nonomartinezalonso

YouTube.com/nonomartinezalonso
