ISSN: 2736-5492 (Online)
Vol. 5 No. 2 (2025)
Articles
Domain-Adaptive Pretraining of Transformer-Based Language Models on Medical Texts: A High-Performance Computing Experiment
Charles Kinyua Gitonga, Lydia Gakii Mugao
Abstract views: 80 | Downloads: 21
DOI: 10.24018/compute.2025.5.2.149
Pages: 1-9
Available formats: PDF, HTML, EPUB, JATS XML