|Conversation
|GPT-3, [[Fine-tuning (machine learning)|fine-tuned]] to follow instructions using [[Reinforcement learning from human feedback|human feedback]].
|175 billion<ref>{{cite journal|last1=Ouyang|first1=Long|last2=Wu|first2=Jeff|last3=Jiang|first3=Xu|last4=Almeida|first4=Diogo|last5=Wainwright|first5=Carroll L.|last6=Mishkin|first6=Pamela|last7=Zhang|first7=Chong|last8=Agarwal|first8=Sandhini|last9=Slama|first9=Katarina|last10=Ray|first10=Alex|last11=Schulman|first11=John|last12=Hilton|first12=Jacob|last13=Kelton|first13=Fraser|last14=Miller|first14=Luke|last15=Simens|first15=Maddie|last16=Askell|first16=Amanda|last17=Welinder|first17=Peter|last18=Christiano|first18=Paul|last19=Leike|first19=Jan|last20=Lowe|first20=Ryan|title=Training language models to follow instructions with human feedback|date=2022-03-04|arxiv=2203.02155|display-authors=3}}</ref>
|{{dunno}}
|March 4, 2022
|-
|[[BioGPT]]
|[[Biomedicine|Biomedical]] content<ref name="pmid36156661">{{cite journal|author=Luo R, Sun L, Xia Y, Qin T, Zhang S, Poon H|display-authors=etal|date=2022-09-24|title=BioGPT: generative pre-trained transformer for biomedical text generation and mining|url=https://www.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi?dbfrom=pubmed&tool=sumsearch.org%2Fcite&retmode=ref&cmd=prlinks&id=36156661|journal=Brief Bioinform|volume=23|issue=6|doi=10.1093/bib/bbac409|pmid=36156661|access-date=2023-02-07|archive-date=2023-04-01|archive-url=https://web.archive.org/web/20230401192557/https://academic.oup.com/bib/article-abstract/23/6/bbac409/6713511?redirectedFrom=fulltext}}</ref><ref>{{cite web|url=https://the-decoder.com/biogpt-is-a-microsoft-language-model-trained-for-biomedical-tasks/|title=BioGPT is a Microsoft language model trained for biomedical tasks|author=Matthias Bastian|website=The Decoder|date=2023-01-29|archive-url=https://web.archive.org/web/20230207174627/https://the-decoder.com/biogpt-is-a-microsoft-language-model-trained-for-biomedical-tasks/|archive-date=2023-02-07|access-date=2023-02-07}}</ref>
|Same as [[GPT-2]] Medium (24 layers, 16 heads)
|347 million