
Papers: GPT & Transformers in Machine Learning

Some papers about Transformers and GPT:

"Attention is All You Need"; Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin . Abstract: "The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.8 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature. We show that the Transformer generalizes well to other tasks by applying it successfully to English constituency parsing both with large and limited training data."  


"Language Models are Unsupervised Multitask Learners"; Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever.   Abstract: "Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on taskspecific datasets. We demonstrate that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText. When conditioned on a document plus questions, the answers generated by the language model reach 55 F1 on the CoQA dataset - matching or exceeding the performance of 3 out of 4 baseline systems without using the 127,000+ training examples. The capacity of the language model is essential to the success of zero-shot task transfer and increasing it improves performance in a log-linear fashion across tasks. Our largest model, GPT-2, is a 1.5B parameter Transformer that achieves state of the art results on 7 out of 8 tested language modeling datasets in a zero-shot setting but still underfits WebText. Samples from the model reflect these improvements and contain coherent paragraphs of text. These findings suggest a promising path towards building language processing systems which learn to perform tasks from their naturally occurring demonstrations."



To suggest a paper or other text on this topic, please leave a comment on this post.

Thank you very much. 

Electron - Creating and Deploying JavaScript Applications


Using NodeJS, NPM, and Electron, it is possible to build portable desktop applications with JavaScript, HTML5, and CSS3 technologies. First install NodeJS, which ships with NPM; then install Electron with npm:

$ npm i -D electron@latest
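As a quick sketch of what an Electron application looks like, the snippet below is a minimal main process. The file names main.js and index.html are illustrative, and package.json must point its "main" field at main.js:

// main.js - minimal Electron main process
const { app, BrowserWindow } = require('electron');

// Create the application window once Electron is ready
app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html'); // assumes an index.html in the project root
});

// Quit when all windows are closed (except on macOS, by convention)
app.on('window-all-closed', () => {
  if (process.platform !== 'darwin') app.quit();
});

With Electron installed as a dev dependency, this can be launched from the project root with

$ npx electron .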


Once Electron is installed, a project can be packaged into binaries for distribution to Linux and Windows systems as single executable files. At the time of this post, a new project is scaffolded with the command


$ npx create-electron-app my-app
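The generated project is based on Electron Forge, and its package.json typically includes start, package, and make scripts (the exact contents may vary with the template version). The app can be tested in development mode before packaging:

$ cd my-app
$ npm start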


The above command generates a directory with the application files. To package the application, run:


$ npm run make

 

A folder named out will appear in the project directory; it contains the application binaries.
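On Linux, for example, the packaged application typically lands in a platform-specific subfolder, while installers and other distributables produced by the configured makers go under out/make. The path below is illustrative; the actual name depends on the application name, platform, and architecture:

$ ./out/my-app-linux-x64/my-app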


References

  

Electron Quick Start - creating and distributing an Electron app: https://www.electronjs.org/docs/latest/tutorial/quick-start

NodeJS - downloads and documentation: https://nodejs.org/en/

ElectronJS - documentation, examples, and source code: https://www.electronjs.org/

ElectronJS documentation - application distribution in binaries

Electron Forge - tool for creating, publishing, and installing modern Electron applications

Electron Builds