
Compressing for AGI

Jack Rae


Last updated 1 year ago

Related commentary: https://zhuanlan.zhihu.com/p/619511222

Original video: https://www.youtube.com/watch?v=dO4TPJkeaaU&t=161s

Theme of talk

  • Think deeply about the training objective of foundation models

  • What are we doing, why does it make sense, and what are the limitations?

Takeaways

  • Seek the minimum description length to solve perception

  • Generative models are lossless compressors

  • Large language models are state-of-the-art lossless text compressors (?!)

  • Current limitations of the approach
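The second takeaway can be sketched numerically: a model that assigns probability p(x) to data admits a lossless code of length -log2 p(x) bits, achievable to within about 2 bits by an arithmetic coder, so a better generative model is directly a better compressor. A minimal sketch with a hypothetical fixed-probability character model (the distribution below is invented for illustration):

```python
import math

# Toy character-level "language model" with fixed probabilities
# (a hypothetical distribution, purely for illustration).
model = {"a": 0.5, "b": 0.25, "c": 0.25}

def code_length_bits(text, model):
    """Ideal lossless code length for `text` under the model:
    -log2 p(text) bits, which an arithmetic coder attains to
    within roughly 2 bits of overhead."""
    return sum(-math.log2(model[ch]) for ch in text)

# "aab" costs 1 + 1 + 2 = 4 bits under this model; a model that puts
# higher probability on the actual data would compress it further.
print(code_length_bits("aab", model))
```

This is why lower language-model log-loss and better lossless text compression are the same objective stated two ways.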

Minimum Description Length

...and why it relates to perception

We want the deepest understanding of our observations: the kind that generalizes.
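The MDL idea can be made concrete with a two-part code: the total cost of the data is (bits to describe the model) + (bits to encode the data under that model). A toy sketch for coin-flip data, using an illustrative 32-bit parameter cost (this constant is an assumption, not a canonical choice):

```python
import math

def entropy_bits(p):
    """Shannon entropy, in bits, of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mdl(n, k, param_bits=32):
    """Two-part description length of n coin flips containing k heads.
    Model A: assume a fair coin -> n bits, no parameters to transmit.
    Model B: transmit p = k/n (param_bits) + n * H(k/n) bits of data.
    Returns the shorter total description."""
    fixed = n
    learned = param_bits + n * entropy_bits(k / n)
    return min(fixed, learned)

# A strong regularity (90% heads over 1000 flips) pays for its own
# parameter cost; a weak one (52% heads over 100 flips) does not.
print(mdl(1000, 900))  # learned model wins
print(mdl(100, 52))    # fair-coin model wins
```

MDL only prefers the richer model when the regularity it captures pays for the cost of describing it, which is why minimizing description length favors exactly the explanations that generalize.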
