{"id":200,"date":"2023-07-13T11:27:07","date_gmt":"2023-07-13T11:27:07","guid":{"rendered":"https:\/\/34.239.202.173\/?p=200"},"modified":"2024-02-01T15:31:02","modified_gmt":"2024-02-01T15:31:02","slug":"revolutionize-longnet-transformers-to-1000000000-tokens","status":"publish","type":"post","link":"https:\/\/mlnews.dev\/revolutionize-longnet-transformers-to-1000000000-tokens\/","title":{"rendered":"REVOLUTIONIZE: LONGNET – Epic Transformers To 1,000,000,000 Tokens"},"content":{"rendered":"\n
Introducing LONGNET, a revolutionary Transformer variant that scales sequence length to more than one billion tokens while maintaining performance and quality on shorter sequences. LONGNET makes extremely long sequences practical by replacing standard attention with dilated attention, a mechanism whose computation grows only linearly with sequence length instead of quadratically. The research was first published at Microsoft Research by Jiayu Ding, Shuming Ma, Li Dong, Xingxing Zhang, Shaohan Huang, Wenhui Wang, and Furu Wei.
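To make the linear-cost idea concrete, here is a minimal sketch of dilated attention in PyTorch. It assumes single-head attention, and the function name `dilated_attention` and the `segment_len` and `dilation` parameters are illustrative choices, not the authors' API: the sequence is split into fixed segments, each segment is sparsified by keeping every `dilation`-th token, and ordinary attention runs only within each sparsified segment.

```python
# A minimal sketch of the dilated-attention idea behind LONGNET (assumed
# simplification: single head, one segment/dilation pattern, no batching).
import torch
import torch.nn.functional as F


def dilated_attention(q, k, v, segment_len=4, dilation=2):
    """Attend within fixed segments, keeping every `dilation`-th position.

    q, k, v: (seq_len, d) tensors; seq_len must be divisible by segment_len.
    Each token attends to at most segment_len / dilation keys, so total cost
    grows linearly with seq_len rather than quadratically.
    """
    seq_len, d = q.shape
    out = torch.zeros_like(q)
    for start in range(0, seq_len, segment_len):
        # Sparsify the segment: keep positions start, start+dilation, ...
        idx = torch.arange(start, start + segment_len, dilation)
        qs, ks, vs = q[idx], k[idx], v[idx]
        # Ordinary scaled dot-product attention on the sparsified segment.
        attn = F.softmax(qs @ ks.T / d ** 0.5, dim=-1)
        # Scatter the outputs back to their original positions.
        out[idx] = attn @ vs
    return out


# Toy usage: 16 tokens with 8-dimensional embeddings.
q, k, v = (torch.randn(16, 8) for _ in range(3))
print(dilated_attention(q, k, v).shape)  # torch.Size([16, 8])
```

Note that this sketch applies only a single pattern, so positions skipped by the dilation receive no output here; in the full method, several (segment length, dilation) configurations with different offsets are mixed across attention heads so that every position is covered while the overall cost stays linear.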