A regular week.
- [2304.11062] Scaling Transformer to 1M tokens and beyond with RMT: "This technical report presents the application of a recurrent memory to extend the context length of BERT, one of the most effective Transformer-based models in natural language processing. By leveraging the Recurrent Memory Transformer architecture, we have successfully increased the model's effective context length to an unprecedented two million tokens, while maintaining high memory retrieval accuracy. Our method allows for the storage and processing of both local and global information and enables information flow between segments of the input sequence through the use of recurrence. Our experiments demonstrate the effectiveness of our approach, which holds significant potential to enhance long-term dependency handling in natural language understanding and generation tasks as well as enable large-scale context processing for memory-intensive applications." (A toy sketch of the segment-recurrence idea follows, after this list.)
- The Ravenna Ultra-Low-Altitude Vehicle: A Rocketship Treehouse: Someone built a rocketship treehouse in their garden for people to play in. Looks awesome.
- Kaffeost - Gastro Obscura: Putting cheese into coffee, what a great idea!
- Exciting SQLite Improvements Since 2020
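The RMT abstract above is dense, so here is a toy sketch of the segment-recurrence idea, my own reconstruction rather than the paper's code: learned memory tokens are prepended to each segment, and their encoder outputs are carried over as the memory for the next segment. All names, sizes, and hyperparameters are made up for illustration.

```python
# Toy sketch of RMT-style segment recurrence (not the authors' code).
import torch
import torch.nn as nn

class ToySegmentRecurrence(nn.Module):
    def __init__(self, d_model=64, n_mem=4, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Learned initial memory tokens, shared across sequences.
        self.init_mem = nn.Parameter(torch.randn(1, n_mem, d_model))
        self.n_mem = n_mem

    def forward(self, segments):
        # segments: list of (batch, seg_len, d_model) tensors.
        mem = self.init_mem.expand(segments[0].size(0), -1, -1)
        outputs = []
        for seg in segments:
            x = torch.cat([mem, seg], dim=1)   # prepend memory tokens
            y = self.encoder(x)
            mem = y[:, :self.n_mem]            # memory carries to the next segment
            outputs.append(y[:, self.n_mem:])  # token outputs for this segment
        return torch.cat(outputs, dim=1), mem

# Usage: a "long" input split into 3 segments.
model = ToySegmentRecurrence()
segs = [torch.randn(2, 16, 64) for _ in range(3)]
out, final_mem = model(segs)
print(out.shape, final_mem.shape)  # torch.Size([2, 48, 64]) torch.Size([2, 4, 64])
```

The point of the recurrence is that each step only attends over one segment plus a handful of memory tokens, so the per-step attention cost stays constant no matter how long the full input grows.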
Libraries, programming, etc.
- GitHub - oliveirabruno01/babyagi-asi: BabyAGI: an Autonomous and Self-Improving agent, or BASI
- GitHub - mlc-ai/mlc-llm: Enable everyone to develop, optimize and deploy AI models natively on everyone's devices.
- The Crimson Petal and the White by Michel Faber: This is on my reading list, and linking it here is slightly meta. Apart from that, I don't know why I'm reading it.
- Meet Britain’s Answer To The Tesla Cybertruck! - YouTube
- Why Are Bows That Shape? - YouTube: Various considerations around efficiency, ship width, and buoyancy.
- Why Do Lighthouses Have Red Stripes? - YouTube: So you can see them during the day.
- Exxon Valdez: Why Didn't She Turn? - YouTube: Multiple failures on all kinds of levels.