Source: US10452978B2 – Attention-based sequence transduction neural networks – Google Patents
Summary
The system transduces an input sequence into an output sequence using attention-based encoder and decoder neural networks, making it faster and more accurate than existing models built on recurrent or convolutional layers.
Fact
💻 The neural network system is implemented as computer programs on one or more computers in one or more locations.
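The patent describes the Transformer family of attention-based models; the core operation is scaled dot-product attention. A minimal NumPy sketch of that operation (an illustration of the general technique, not the patented implementation):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)          # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ v                        # weighted average of the values
```

In the full architecture this runs per head, with learned projections producing Q, K, and V from the encoder or decoder inputs.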
Source: Web Scraping with GPT-4
Summary
🤖 Kadoa is experimenting with autonomous web scraping at scale using LLMs (GPT-3.5 and GPT-4). They are considering releasing the technology under an open-source license if there is enough interest.
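The basic pattern behind LLM-driven scraping is to hand raw HTML to a chat model with a schema prompt and parse structured data out of the reply. A hedged sketch of that loop; `call_llm` is a hypothetical stand-in for any model client (this is not Kadoa's code):

```python
import json
import re

def extract_products(html, call_llm):
    """Ask an LLM (via the caller-supplied `call_llm` function) to pull
    product records out of raw HTML and return them as Python dicts."""
    prompt = (
        "Extract every product from this HTML as a JSON array of "
        'objects like [{"name": ..., "price": ...}]:\n\n' + html
    )
    reply = call_llm(prompt)
    # Models often wrap JSON in prose, so grab the first bracketed array.
    match = re.search(r"\[.*\]", reply, re.S)
    return json.loads(match.group(0)) if match else []
```

Injecting `call_llm` keeps the sketch independent of any particular provider SDK and makes it easy to test against a canned reply.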
Source: How does data removal from GPT work
Summary
OpenAI offers a form for data removal from their models, but it is unclear how effective it will be as removing data from a trained language model is not a straightforward process.
Source: Lotus 1-2-3
Summary
The author enjoys using Lotus 1-2-3 for work, but its limited display resolution makes it difficult to use on modern terminals. They researched a potential solution involving custom display drivers from an old Epson Endeavor PC but found no concrete solution.
Source: Read and discuss the latest engineering books with other professionals
Summary
A Philosophy of Software Design is a book that teaches how to manage complexity in software design. It presents a collection of design principles and red flags, helping readers to minimize complexity and write software more efficiently.
Fact
📘 The book addresses software design and how to decompose complex software systems into modules.
📚 It frames the fundamental problem in software design as managing complexity.
🧐 It presents a set of design principles to apply during design, along with red flags that signal design problems.
💻 These ideas help minimize the complexity of large software systems, making it possible to write software more quickly and cheaply.
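One of the book's central principles is the "deep module": a small, simple interface that hides substantial implementation detail, so callers carry none of the internal complexity. A minimal illustrative sketch (the class and method names here are my own, not examples from the book):

```python
class WordIndex:
    """Deep module: a two-method interface (add/lookup) that hides
    tokenization, case normalization, and storage details from callers."""

    def __init__(self):
        self._index = {}  # word -> set of document ids

    def add(self, doc_id, text):
        """Index every word of `text` under `doc_id`."""
        for word in text.lower().split():
            self._index.setdefault(word, set()).add(doc_id)

    def lookup(self, word):
        """Return sorted ids of documents containing `word` (case-insensitive)."""
        return sorted(self._index.get(word.lower(), set()))
```

A shallow version would instead expose the tokenizer, the normalization step, and the dict to every caller, spreading that complexity across the codebase.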