• Reference Citation Analysis
For: Kozachkov L, Kastanenka KV, Krotov D. Building transformers from neurons and astrocytes. Proc Natl Acad Sci U S A 2023;120:e2219150120. [PMID: 37579149; PMCID: PMC10450673; DOI: 10.1073/pnas.2219150120] Open access.
Cited by the following article(s):
1. Gong L, Pasqualetti F, Papouin T, Ching S. Astrocytes as a mechanism for contextually-guided network dynamics and function. PLoS Comput Biol 2024;20:e1012186. [PMID: 38820533; DOI: 10.1371/journal.pcbi.1012186] Open access.
2. Won D, Lee EH, Chang JE, Nam MH, Park KD, Oh SJ, Hwang JY. The role of astrocytic γ-aminobutyric acid in the action of inhalational anesthetics. Eur J Pharmacol 2024;970:176494. [PMID: 38484926; DOI: 10.1016/j.ejphar.2024.176494]
3. Tuckute G, Sathe A, Srikant S, Taliaferro M, Wang M, Schrimpf M, Kay K, Fedorenko E. Driving and suppressing the human language network using large language models. Nat Hum Behav 2024;8:544-561. [PMID: 38172630; DOI: 10.1038/s41562-023-01783-7]
4. Ellwood IT. Short-term Hebbian learning can implement transformer-like attention. PLoS Comput Biol 2024;20:e1011843. [PMID: 38277432; PMCID: PMC10849393; DOI: 10.1371/journal.pcbi.1011843] Open access.
© 2004-2024 Baishideng Publishing Group Inc. All rights reserved. 7041 Koll Center Parkway, Suite 160, Pleasanton, CA 94566, USA