
Backstory: GPT

1. Backstory: it began with the development of a neural network architecture called "the transformer" (a neural network is a program that learns statistical patterns from large amounts of example data). It was introduced in a research paper by Google in 2017 and quickly became the best available architecture for natural language processing. Its big advantage was its ability to process long stretches of text, a huge improvement over the other architectures of the time (a short code sketch of the mechanism behind this follows after the list).
2. In 2018 OpenAI (an AI research lab founded in 2015) released the first version of GPT (= generative pre-trained transformer), which was based on the transformer but more effective in practice. GPT-1 was trained on a large dataset of websites and books and was revolutionary for its time.
3. In 2019 OpenAI released the second version, GPT-2. Its training dataset was much larger and it had 10 times as many parameters, so it was far better at generating realistic text. OpenAI feared that such a powerful language model could be misused, so it chose not to release the full model to the public; instead it published a scaled-down version.
4. In 2020 OpenAI released GPT-3, which is to date the largest and most powerful language model. It has 175 billion parameters and was trained on a massive dataset of websites, books and other text sources. It shows remarkable abilities in natural language processing, including text completion, translation and even question answering.
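
To make point 1 concrete, here is a minimal sketch of scaled dot-product self-attention, the core operation inside the transformer, written in Python with numpy. The function name and the toy sizes are ours for illustration only; real models add learned projection matrices, multiple attention heads and many stacked layers on top of this.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # every token is compared with every other token, so long-range
        # dependencies in the text are captured in a single step
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                            # pairwise similarities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over each row
        return weights @ V                                         # weighted mix of value vectors

    # toy example: 4 tokens, embedding size 8; self-attention means Q = K = V
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))
    out = scaled_dot_product_attention(x, x, x)
    print(out.shape)  # (4, 8): one updated vector per token

In GPT the same self-attention is used, but with a mask so that each token can only attend to the tokens before it, which is what makes the model generative.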