Publications
Publications resulting from research conducted using Delta appear here. Check back to see how the list of exciting discoveries made using Delta grows.
If you have a publication that should be listed here and isn’t, please share your success with us!
1. Richards, C., Dima, A., Ferguson, D. & Witek, H. Growing black-hole hair in nonminimally coupled biscalar gravity. Preprint at https://doi.org/10.48550/ARXIV.2501.14034 (2025).
2. Osorio, J. et al. Keep it Local: Comparing Domain-Specific LLMs in Native and Machine Translated Text using Parallel Corpora on Political Conflict. in 2024 2nd International Conference on Foundation and Large Language Models (FLLM) 542–552 (IEEE, Dubai, United Arab Emirates, 2024). https://doi.org/10.1109/FLLM63129.2024.10852489.
3. Avdiunina, P., Jamal, S., Gusev, F. & Isayev, O. All that glitters is not gold: Importance of rigorous evaluation of proteochemometric models. Preprint at https://doi.org/10.26434/chemrxiv-2025-vbmgc (2025).
4. Pilny, A., Bonito, J. & Schecter, A. Coding Small Group Communication with AI: RNNs and Transformers with Context. Small Group Research (2025). https://doi.org/10.1177/10464964251314197.
5. Deng, J. et al. dattri: A Library for Efficient Data Attribution. Preprint at https://doi.org/10.48550/ARXIV.2410.04555 (2024).
6. Chen, W., Yan, B., Chen, C.-C. & Watanabe, S. Floras 50: A Massively Multilingual Multitask Benchmark for Long-Form Conversational Speech. in 2024 IEEE Spoken Language Technology Workshop (SLT) 891–898 (IEEE, Macao, 2024). https://doi.org/10.1109/SLT61566.2024.10832167.
7. Nakamura, T. et al. Discrete Speech Unit Extraction via Independent Component Analysis. Preprint at https://doi.org/10.48550/ARXIV.2501.06562 (2025).
8. Khot, A., Wang, X., Roy, A., Kindratenko, V. & Neubauer, M. S. Evidential Deep Learning for Uncertainty Quantification and Out-of-Distribution Detection in Jet Identification using Deep Neural Networks. Preprint at https://doi.org/10.48550/ARXIV.2501.05656 (2025).
9. Andrews, J., Weirich, K. & Schiller, U. D. Molecular-Scale Simulation of Wetting of Actin Filaments by Protein Droplets. J. Phys. Chem. B 129, 1109–1121 (2025). https://doi.org/10.1021/acs.jpcb.4c07282.
10. Wang, S. et al. Deep CNN-based semi-supervised learning approach for identifying and segmenting corrosion in hydraulic steel and water resources infrastructure. Structural Health Monitoring (2025). https://doi.org/10.1177/14759217241305039.
11. Feng, J. T., Satheesan, S. P., Kong, S., Donders, T. H. & Punyasena, S. W. Addressing the open world: detecting and segmenting pollen on palynological slides with deep learning. Preprint at https://doi.org/10.1101/2025.01.05.631390 (2025).
12. Vatansever, D. & Levin, D. Collisionless Plasma Plume Expansion Under External Magnetic Fields. (2025). https://arc.aiaa.org/doi/10.2514/6.2025-2491.
13. Wu, Y. et al. Enhancing Audiovisual Speech Recognition through Bifocal Preference Optimization. Preprint at https://doi.org/10.48550/ARXIV.2412.19005 (2024).
14. Imam, I. A. et al. Integrating Protein Language Model and Molecular Dynamics Simulations to Discover Antibiofouling Peptides. Langmuir 41, 811–821 (2025). https://doi.org/10.1021/acs.langmuir.4c04140.
15. Kobayashi, K. & Alam, S. B. Physics-regularized neural networks for predictive modeling of silicon carbide swelling with limited experimental data. Sci Rep 14, 30666 (2024). https://doi.org/10.1038/s41598-024-78037-7.
16. Hassan, U., Zhu, J., Chen, D. & Cheung, S.-C. S. DPGEM: Differentially Private Generative Model with Exponential Mechanism. in 2024 IEEE International Workshop on Information Forensics and Security (WIFS) 1–6 (IEEE, Rome, Italy, 2024). https://doi.org/10.1109/WIFS61860.2024.10810705.
17. Padmanabha, G. A., Safta, C., Bouklas, N. & Jones, R. E. Condensed Stein Variational Gradient Descent for Uncertainty Quantification of Neural Networks. Preprint at https://doi.org/10.48550/ARXIV.2412.16462 (2024).
%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EBrandt%2C%20P.%20T.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20ConfliBERT%3A%20A%20Language%20Model%20for%20Political%20Conflict.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.15060%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.15060%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22ConfliBERT%3A%20A%20Language%20Model%20for%20Political%20Conflict%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Patrick%20T.%22%2C%22lastName%22%3A%22Brandt%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sultan%22%2C%22lastName%22%3A%22Alsarra%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vito%20J.%22%2C%22lastName%22%3A%22D%60Orazio%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dagmar%22%2C%22lastName%22%3A%22Heintze%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Latifur%22%2C%22lastName%22%3A%22Khan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Shreyas%22%2C%22lastName%22%3A%22Meher%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Javier%22%2C%22lastName%22%3A%22Osorio%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marcus%22%2C%22lastName%22%3A%22Sianan%22%7D%5D%2C%22abstractNote%22%3A%22Conflict%20scholars%20have%20used%20rule-based%20approaches%20to%20extract%20information%20about%20political%20violence%20from%20news%20reports%20and%20texts.%20Recent%20Natural%20Language%20Processing%20developments%20move%20beyond%20rigid%20rule-based%20approaches.%20We%20review%20our%20recent%20ConfliBERT%20language%20model%20%28Hu%20et%20al.%202022%29%20to%20process%20political%20and%20violence%20related%20texts.%20The%20model%20can%20be%20used%20to%20extract%20actor%20and%20action%20classifications%20from%20texts%20about%20political%20conflict.%20When%20fine-tuned%2C%20results%20show%20that%20ConfliBERT%20has%20superior%20performance%20in%20accuracy%2C%20precision%20and%20recall%20over%20other%20large%20language%20models%20%28LLM%29%20like%20Google%27s%20Gemma%202%20%289B%29%2C%20Meta%27s%20Llama%203.1%20%287B%29%2C%20and%20Alibaba%27s%20Qwen%202.5%20%2814B%29%20within%20its%20relevant%20domains.%20It%20is%20also%20hundreds%20of%20times%20faster%20than%20these%20more%20generalist%20LLMs.%20These%20results%20are%20illustrated%20using%20texts%20from%20the%20BBC%2C%20re3d%2C%20and%20the%20Global%20Terrorism%20Dataset%20%28GTD%29.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.15060%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.15060%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A06%3A42Z%22%7D%7D%2C%7B%22key%22%3A%2284Z6BCFX%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Modesitt%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20paddi
ng-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EModesitt%2C%20E.%2C%20Yang%2C%20K.%2C%20Hulsey%2C%20S.%2C%20Zhai%2C%20C.%20%26amp%3B%20Kindratenko%2C%20V.%20ORBIT%3A%20Cost-Effective%20Dataset%20Curation%20for%20Large%20Language%20Model%20Domain%20Adaptation%20with%20an%20Astronomy%20Case%20Study.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.14436%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.14436%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22ORBIT%3A%20Cost-Effective%20Dataset%20Curation%20for%20Large%20Language%20Model%20Domain%20Adaptation%20with%20an%20Astronomy%20Case%20Study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eric%22%2C%22lastName%22%3A%22Modesitt%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ke%22%2C%22lastName%22%3A%22Yang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Spencer%22%2C%22lastName%22%3A%22Hulsey%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chengxiang%22%2C%22lastName%22%3A%22Zhai%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Volodymyr%22%2C%22lastName%22%3A%22Kindratenko%22%7D%5D%2C%22abstractNote%22%3A%22Recent%20advances%20in%20language%20modeling%20demonstrate%20the%20need%20for%20high-quality%20domain-specific%20training%20data%2C%20especially%20for%20tasks%20that%20require%20specialized%20knowledge.%20General-purpose%20models%2C%20while%20versatile%2C%20often%20lack%20the%20depth%20needed%20for%20expert-level%20tasks%20because%20of%20limited%20domain-specific%20information.%20Domain%20adaptation%20training%20can%20enhance%20these%20models%2C%20but%20it%20demands%20substantial%2C%20high-quality%20data.%20To%20address%20this%2C%20we%20propose%20ORBIT%2C%20a%20cost-efficient%20methodology%20for%20curating%20massive%2C%20high-quality%20domain-specific%20datasets%20from%20noisy%20web%20sources%2C%20tailored%20for%20training%20specialist%20large%20language%20models.%20Using%20astronomy%20as%20a%20primary%20case%20study%2C%20we%20refined%20the%201.3T-token%20FineWeb-Edu%20dataset%20into%20a%20high-quality%2C%2010B-token%20subset%20focused%20on%20astronomy.%20Fine-tuning%20%5C%5Ctextsc%7BLLaMA-3-8B%7D%20on%20a%201B-token%20astronomy%20subset%20improved%20performance%20on%20the%20MMLU%20astronomy%20benchmark%20from%2069%5C%5C%25%20to%2076%5C%5C%25%20and%20achieved%20top%20results%20on%20AstroBench%2C%20an%20astronomy-specific%20benchmark.%20Moreover%2C%20our%20model%20%28Orbit-LLaMA%29%20outperformed%20%5C%5Ctextsc%7BLLaMA-3-8B-base%7D%2C%20with%20GPT-4o%20evaluations%20preferring%20it%20in%2073%5C%5C%25%20of%20cases%20across%201000%20astronomy-specific%20questions.%20Additionally%2C%20we%20validated%20ORBIT%27s%20generalizability%20by%20applying%20it%20to%20law%20and%20medicine%2C%20achieving%20a%20significant%20improvement%20of%20data%20quality%20compared%20to%20an%20unfiltered%20baseline.%20We%20open-source%20the%20ORBIT%20methodology%2C%20including%20the%20curated%20datasets%2C%20the%20codebase%2C%20and%20the%20resulting%20model%20at%20%5C%5Chref%7Bhttps%3A%5C%2F%5C%2Fgithub.com%5C%2FModeEric%5C%2FORBIT-Llama%7D%7Bhttps%3A%5C%2F%5C%2Fgithub.com%5C%2FModeEric%5C%2FORBIT-Llama%7D.%22%2C%22ge
nre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.14436%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.14436%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A06%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22YGSQETRS%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Xu%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EXu%2C%20Z.%2C%20Yan%2C%20J.%2C%20Gupta%2C%20A.%20%26amp%3B%20Srikumar%2C%20V.%20State%20Space%20Models%20are%20Strong%20Text%20Rerankers.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.14354%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.14354%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22State%20Space%20Models%20are%20Strong%20Text%20Rerankers%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zhichao%22%2C%22lastName%22%3A%22Xu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jinghua%22%2C%22lastName%22%3A%22Yan%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ashim%22%2C%22lastName%22%3A%22Gupta%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Vivek%22%2C%22lastName%22%3A%22Srikumar%22%7D%5D%2C%22abstractNote%22%3A%22Transformers%20dominate%20NLP%20and%20IR%3B%20but%20their%20inference%20inefficiencies%20and%20challenges%20in%20extrapolating%20to%20longer%20contexts%20have%20sparked%20interest%20in%20alternative%20model%20architectures.%20Among%20these%2C%20state%20space%20models%20%28SSMs%29%20like%20Mamba%20offer%20promising%20advantages%2C%20particularly%20%24O%281%29%24%20time%20complexity%20in%20inference.%20Despite%20their%20potential%2C%20SSMs%27%20effectiveness%20at%20text%20reranking%20--%20a%20task%20requiring%20fine-grained%20query-document%20interaction%20and%20long-context%20understanding%20--%20remains%20underexplored.%5Cn%20This%20study%20benchmarks%20SSM-based%20architectures%20%28specifically%2C%20Mamba-1%20and%20Mamba-2%29%20against%20transformer-based%20models%20across%20various%20scales%2C%20architectures%2C%20and%20pre-training%20objectives%2C%20focusing%20on%20performance%20and%20efficiency%20in%20text%20reranking%20tasks.%20We%20find%20that%20%281%29%20Mamba%20architectures%20achieve%20competitive%20text%20ranking%20performance%2C%20comparable%20to%20transformer-based%20models%20of%20similar%20size%3B%20%282%29%20they%20are%20less%20efficient%20in%20training%20and%20inference%20compared%20to%20transformers%20with%20flash%20attention%3B%20and%20%283%29%20Mamba-2%20outperforms%20Mamba-1%20in%20both%20performance%20and%20efficiency.%20These%20results%20underscore%20the%20potential%20of%20state%20space%20models%20as%20a%
20transformer%20alternative%20and%20highlight%20areas%20for%20improvement%20in%20future%20IR%20applications.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.14354%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.14354%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A05%3A54Z%22%7D%7D%2C%7B%22key%22%3A%22TCYRXMTF%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Fukami%20and%20Taira%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EFukami%2C%20K.%20%26amp%3B%20Taira%2C%20K.%20Single-snapshot%20machine%20learning%20for%20super-resolution%20of%20turbulence.%20%282024%29%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27http%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2409.04923%27%3Ehttp%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2409.04923%3C%5C%2Fa%3E.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Single-snapshot%20machine%20learning%20for%20super-resolution%20of%20turbulence%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kai%22%2C%22lastName%22%3A%22Fukami%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kunihiko%22%2C%22lastName%22%3A%22Taira%22%7D%5D%2C%22abstractNote%22%3A%22Modern%20machine-learning%20techniques%20are%20generally%20considered%20data-hungry.%20However%2C%20this%20may%20not%20be%20the%20case%20for%20turbulence%20as%20each%20of%20its%20snapshots%20can%20hold%20more%20information%20than%20a%20single%20data%20file%20in%20general%20machine-learning%20settings.%20This%20study%20asks%20the%20question%20of%20whether%20nonlinear%20machine-learning%20techniques%20can%20effectively%20extract%20physical%20insights%20even%20from%20as%20little%20as%20a%20%7B%5C%5Cit%20single%7D%20snapshot%20of%20turbulent%20flow.%20As%20an%20example%2C%20we%20consider%20machine-learning-based%20super-resolution%20analysis%20that%20reconstructs%20a%20high-resolution%20field%20from%20low-resolution%20data%20for%20two%20examples%20of%20two-dimensional%20isotropic%20turbulence%20and%20three-dimensional%20turbulent%20channel%20flow.%20First%2C%20we%20reveal%20that%20a%20carefully%20designed%20machine-learning%20model%20trained%20with%20flow%20tiles%20sampled%20from%20only%20a%20single%20snapshot%20can%20reconstruct%20vortical%20structures%20across%20a%20range%20of%20Reynolds%20numbers%20for%20two-dimensional%20decaying%20turbulence.%20Successful%20flow%20reconstruction%20indicates%20that%20nonlinear%20machine-learning%20techniques%20can%20leverage%20scale-invariance%20properties%20to%20learn%20turbulent%20flows.%20We%20also%20show%20that%20training%20data%20of%20turbulent%20flows%20can%20be%20cleverly%20collected%20from%20a%20single%20snapshot%20by%20considering%20characteristics%20of%20rotation%2
0and%20shear%20tensors.%20Second%2C%20we%20perform%20the%20single-snapshot%20super-resolution%20analysis%20for%20turbulent%20channel%20flow%2C%20showing%20that%20it%20is%20possible%20to%20extract%20physical%20insights%20from%20a%20single%20flow%20snapshot%20even%20with%20inhomogeneity.%20The%20present%20findings%20suggest%20that%20embedding%20prior%20knowledge%20in%20designing%20a%20model%20and%20collecting%20data%20is%20important%20for%20a%20range%20of%20data-driven%20analyses%20for%20turbulent%20flows.%20More%20broadly%2C%20this%20work%20hopes%20to%20stop%20machine-learning%20practitioners%20from%20being%20wasteful%20with%20turbulent%20flow%20data.%22%2C%22date%22%3A%222024%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2409.04923%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2409.04923%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A05%3A39Z%22%7D%7D%2C%7B%22key%22%3A%226SKHYBAD%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Kacmaz%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EKacmaz%2C%20S.%2C%20Haas%2C%20R.%20%26amp%3B%20Huerta%2C%20E.%20A.%20Machine%20learning-driven%20conservative-to-primitive%20conversion%20in%20hybrid%20piecewise%20polytropic%20and%20tabulated%20equations%20of%20state.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.07836%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.07836%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Machine%20learning-driven%20conservative-to-primitive%20conversion%20in%20hybrid%20piecewise%20polytropic%20and%20tabulated%20equations%20of%20state%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Semih%22%2C%22lastName%22%3A%22Kacmaz%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Roland%22%2C%22lastName%22%3A%22Haas%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22E.%20A.%22%2C%22lastName%22%3A%22Huerta%22%7D%5D%2C%22abstractNote%22%3A%22We%20present%20a%20novel%20machine%20learning%20%28ML%29%20method%20to%20accelerate%20conservative-to-primitive%20inversion%2C%20focusing%20on%20hybrid%20piecewise%20polytropic%20and%20tabulated%20equations%20of%20state.%20Traditional%20root-finding%20techniques%20are%20computationally%20expensive%2C%20particularly%20for%20large-scale%20relativistic%20hydrodynamics%20simulations.%20To%20address%20this%2C%20we%20employ%20feedforward%20neural%20networks%20%28NNC2PS%20and%20NNC2PL%29%2C%20trained%20in%20PyTorch%20and%20optimized%20for%20GPU%20inference%20using%20NVIDIA%20TensorRT%2C%20achieving%20significant%20speedups%20with%20minimal%20accuracy%20loss.%20The%20NNC2PS%20model%20achieves%20%24%20L_1%20%24%20and%20%24%20L_%5C%5Cinfty%20%24%20errors%20of%20%24%204.54%20%5C%5Ctimes%2010%5E%7
B-7%7D%20%24%20and%20%24%203.44%20%5C%5Ctimes%2010%5E%7B-6%7D%20%24%2C%20respectively%2C%20while%20the%20NNC2PL%20model%20exhibits%20even%20lower%20error%20values.%20TensorRT%20optimization%20with%20mixed-precision%20deployment%20substantially%20accelerates%20performance%20compared%20to%20traditional%20root-finding%20methods.%20Specifically%2C%20the%20mixed-precision%20TensorRT%20engine%20for%20NNC2PS%20achieves%20inference%20speeds%20approximately%20400%20times%20faster%20than%20a%20traditional%20single-threaded%20CPU%20implementation%20for%20a%20dataset%20size%20of%201%2C000%2C000%20points.%20Ideal%20parallelization%20across%20an%20entire%20compute%20node%20in%20the%20Delta%20supercomputer%20%28Dual%20AMD%2064%20core%202.45%20GHz%20Milan%20processors%3B%20and%208%20NVIDIA%20A100%20GPUs%20with%2040%20GB%20HBM2%20RAM%20and%20NVLink%29%20predicts%20a%2025-fold%20speedup%20for%20TensorRT%20over%20an%20optimally-parallelized%20numerical%20method%20when%20processing%208%20million%20data%20points.%20Moreover%2C%20the%20ML%20method%20exhibits%20sub-linear%20scaling%20with%20increasing%20dataset%20sizes.%20We%20release%20the%20scientific%20software%20developed%2C%20enabling%20further%20validation%20and%20extension%20of%20our%20findings.%20This%20work%20underscores%20the%20potential%20of%20ML%2C%20combined%20with%20GPU%20optimization%20and%20model%20quantization%2C%20to%20accelerate%20conservative-to-primitive%20inversion%20in%20relativistic%20hydrodynamics%20simulations.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.07836%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.07836%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A05%3A12Z%22%7D%7D%2C%7B%22key%22%3A%22LRVF7CXZ%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Mark%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EMark%2C%20M.%20S.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20Policy%20Agnostic%20RL%3A%20Offline%20RL%20and%20Online%20RL%20Fine-Tuning%20of%20Any%20Class%20and%20Backbone.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.06685%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.06685%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Policy%20Agnostic%20RL%3A%20Offline%20RL%20and%20Online%20RL%20Fine-Tuning%20of%20Any%20Class%20and%20Backbone%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Max%20Sobol%22%2C%22lastName%22%3A%22Mark%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tian%22%2C%22lastName%22%3A%22Gao%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Georgia%20Gabriela%22%2C%22lastName%22%3A%22Sa
mpaio%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mohan%20Kumar%22%2C%22lastName%22%3A%22Srirama%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Archit%22%2C%22lastName%22%3A%22Sharma%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Chelsea%22%2C%22lastName%22%3A%22Finn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aviral%22%2C%22lastName%22%3A%22Kumar%22%7D%5D%2C%22abstractNote%22%3A%22Recent%20advances%20in%20learning%20decision-making%20policies%20can%20largely%20be%20attributed%20to%20training%20expressive%20policy%20models%2C%20largely%20via%20imitation%20learning.%20While%20imitation%20learning%20discards%20non-expert%20data%2C%20reinforcement%20learning%20%28RL%29%20can%20still%20learn%20from%20suboptimal%20data.%20However%2C%20instantiating%20RL%20training%20of%20a%20new%20policy%20class%20often%20presents%20a%20different%20challenge%3A%20most%20deep%20RL%20machinery%20is%20co-developed%20with%20assumptions%20on%20the%20policy%20class%20and%20backbone%2C%20resulting%20in%20poor%20performance%20when%20the%20policy%20class%20changes.%20For%20instance%2C%20SAC%20utilizes%20a%20low-variance%20reparameterization%20policy%20gradient%20for%20Gaussian%20policies%2C%20but%20this%20is%20unstable%20for%20diffusion%20policies%20and%20intractable%20for%20autoregressive%20categorical%20policies.%20To%20address%20this%20issue%2C%20we%20develop%20an%20offline%20RL%20and%20online%20fine-tuning%20approach%20called%20policy-agnostic%20RL%20%28PA-RL%29%20that%20can%20effectively%20train%20multiple%20policy%20classes%2C%20with%20varying%20architectures%20and%20sizes.%20We%20build%20off%20the%20basic%20idea%20that%20a%20universal%20supervised%20learning%20loss%20can%20replace%20the%20policy%20improvement%20step%20in%20RL%2C%20as%20long%20as%20it%20is%20applied%20on%20%5C%22optimized%5C%22%20actions.%20To%20obtain%20these%20optimized%20actions%2C%20we%20first%20sample%20multiple%20actions%20from%20a%20base%20policy%2C%20and%20run%20global%20optimization%20%28i.e.%2C%20re-ranking%20multiple%20action%20samples%20using%20the%20Q-function%29%20and%20local%20optimization%20%28i.e.%2C%20running%20gradient%20steps%20on%20an%20action%20sample%29%20to%20maximize%20the%20critic%20on%20these%20candidates.%20PA-RL%20enables%20fine-tuning%20diffusion%20and%20transformer%20policies%20with%20either%20autoregressive%20tokens%20or%20continuous%20action%20outputs%2C%20at%20different%20sizes%2C%20entirely%20via%20actor-critic%20RL.%20Moreover%2C%20PA-RL%20improves%20the%20performance%20and%20sample-efficiency%20by%20up%20to%202%20times%20compared%20to%20existing%20offline%20RL%20and%20online%20fine-tuning%20methods.%20We%20show%20the%20first%20result%20that%20successfully%20fine-tunes%20OpenVLA%2C%20a%207B%20generalist%20robot%20policy%2C%20autonomously%20with%20Cal-QL%2C%20an%20online%20RL%20fine-tuning%20algorithm%2C%20improving%20from%2040%25%20to%2070%25%20in%20the%20real%20world%20in%2040%20minutes.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.06685%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.06685%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A04%3A56Z%22%7D%7D%2C%7B%22key%22%3A%227YD6RC7I%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Kim%20et%20al.%22%2C%22parsedDate%2
2%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EKim%2C%20Y.%2C%20Most%2C%20E.%20R.%2C%20Beloborodov%2C%20A.%20M.%20%26amp%3B%20Ripperda%2C%20B.%20Black%20hole%20pulsars%20and%20monster%20shocks%20as%20outcomes%20of%20black%20hole-neutron%20star%20mergers.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.05760%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.05760%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Black%20hole%20pulsars%20and%20monster%20shocks%20as%20outcomes%20of%20black%20hole-neutron%20star%20mergers%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yoonsoo%22%2C%22lastName%22%3A%22Kim%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Elias%20R.%22%2C%22lastName%22%3A%22Most%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andrei%20M.%22%2C%22lastName%22%3A%22Beloborodov%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Bart%22%2C%22lastName%22%3A%22Ripperda%22%7D%5D%2C%22abstractNote%22%3A%22The%20merger%20of%20a%20black%20hole%20%28BH%29%20and%20a%20neutron%20star%20%28NS%29%20in%20most%20cases%20is%20expected%20to%20leave%20no%20material%20around%20the%20remnant%20BH%3B%20therefore%2C%20such%20events%20are%20often%20considered%20as%20sources%20of%20gravitational%20waves%20without%20electromagnetic%20counterparts.%20However%2C%20a%20bright%20counterpart%20can%20emerge%20if%20the%20NS%20is%20strongly%20magnetized%2C%20as%20its%20external%20magnetosphere%20can%20experience%20radiative%20shocks%20and%20magnetic%20reconnection%20during%5C%2Fafter%20the%20merger.%20We%20use%20magnetohydrodynamic%20simulations%20in%20the%20dynamical%20spacetime%20of%20a%20merging%20BH-NS%20binary%20to%20investigate%20its%20magnetospheric%20dynamics.%20We%20find%20that%20the%20magnetosphere%20develops%20compressive%20waves%20that%20steepen%20into%20shocks.%20After%20swallowing%20the%20NS%2C%20the%20BH%20acquires%20a%20magnetosphere%20that%20quickly%20evolves%20into%20a%20split%20monopole%20configuration%20and%20then%20undergoes%20an%20exponential%20decay%20%28balding%29%2C%20enabled%20by%20magnetic%20reconnection%20and%20also%20assisted%20by%20the%20ring-down%20of%20the%20remnant%20BH.%20This%20spinning%20BH%20drags%20the%20split%20monopole%20into%20rotation%2C%20forming%20a%20transient%20pulsar-like%20state.%20It%20emits%20a%20striped%20wind%20if%20the%20swallowed%20magnetic%20dipole%20moment%20is%20inclined%20to%20the%20spin%20axis.%20We%20predict%20two%20types%20of%20transients%20from%20this%20scenario%3A%20%281%29%20a%20fast%20radio%20burst%20emitted%20by%20the%20shocks%20as%20they%20expand%20to%20large%20radii%20and%20%282%29%20an%20X%5C%2Fgamma-ray%20burst%20emitted%20by%20the%20%24e%5E%5C%5Cpm%24%20outflow%20heated%20by%20magnetic%20dissipation.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3
A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.05760%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.05760%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T16%3A04%3A39Z%22%7D%7D%2C%7B%22key%22%3A%22XCNY2UGA%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Chen%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EChen%2C%20P.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20Learning%20a%20Filtered%20Backprojection%20Reconstruction%20Method%20for%20Photoacoustic%20Computed%20Tomography%20with%20Hemispherical%20Measurement%20Geometries.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.01971%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.01971%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Learning%20a%20Filtered%20Backprojection%20Reconstruction%20Method%20for%20Photoacoustic%20Computed%20Tomography%20with%20Hemispherical%20Measurement%20Geometries%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Panpan%22%2C%22lastName%22%3A%22Chen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Seonyeong%22%2C%22lastName%22%3A%22Park%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Refik%20Mert%22%2C%22lastName%22%3A%22Cam%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hsuan-Kai%22%2C%22lastName%22%3A%22Huang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Alexander%20A.%22%2C%22lastName%22%3A%22Oraevsky%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Umberto%22%2C%22lastName%22%3A%22Villa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mark%20A.%22%2C%22lastName%22%3A%22Anastasio%22%7D%5D%2C%22abstractNote%22%3A%22In%20certain%20three-dimensional%20%283D%29%20applications%20of%20photoacoustic%20computed%20tomography%20%28PACT%29%2C%20including%20%5C%5Ctextit%7Bin%20vivo%7D%20breast%20imaging%2C%20hemispherical%20measurement%20apertures%20that%20enclose%20the%20object%20within%20their%20convex%20hull%20are%20employed%20for%20data%20acquisition.%20Data%20acquired%20with%20such%20measurement%20geometries%20are%20referred%20to%20as%20%5C%5Ctextit%7Bhalf-scan%7D%20data%2C%20as%20only%20half%20of%20a%20complete%20spherical%20measurement%20aperture%20is%20employed.%20Although%20previous%20studies%20have%20demonstrated%20that%20half-scan%20data%20can%20uniquely%20and%20stably%20reconstruct%20the%20sought-after%20object%2C%20no%20closed-form%20reconstruction%20formula%20for%20use%20with%20half-scan%20data%20has%20been%20reported.%20To%20address%20this%2C%20a%20semi-analytic%20reconstruction%20method%20in%20the%20form%20of%20filtered%20backprojection%20%28FBP%29%2C%20referred%20to%20as%20the%20half-
scan%20FBP%20method%2C%20is%20developed%20in%20this%20work.%20Because%20the%20explicit%20form%20of%20the%20filtering%20operation%20in%20the%20half-scan%20FBP%20method%20is%20not%20currently%20known%2C%20a%20learning-based%20method%20is%20proposed%20to%20approximate%20it.%20The%20proposed%20method%20is%20systematically%20investigated%20by%20use%20of%20virtual%20imaging%20studies%20of%203D%20breast%20PACT%20that%20employ%20ensembles%20of%20numerical%20breast%20phantoms%20and%20a%20physics-based%20model%20of%20the%20data%20acquisition%20process.%20The%20method%20is%20subsequently%20applied%20to%20experimental%20data%20acquired%20in%20an%20%5C%5Ctextit%7Bin%20vivo%7D%20breast%20PACT%20study.%20The%20results%20confirm%20that%20the%20half-scan%20FBP%20method%20can%20accurately%20reconstruct%203D%20images%20from%20half-scan%20data.%20Importantly%2C%20because%20the%20sought-after%20inverse%20mapping%20is%20well-posed%2C%20the%20reconstruction%20method%20remains%20accurate%20even%20when%20applied%20to%20data%20that%20differ%20considerably%20from%20those%20employed%20to%20learn%20the%20filtering%20operation.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.01971%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.01971%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T15%3A44%3A27Z%22%7D%7D%2C%7B%22key%22%3A%22RTG4TZSQ%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Kobayashi%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EKobayashi%2C%20K.%2C%20Ahmed%2C%20F.%20%26amp%3B%20Alam%2C%20S.%20B.%20Virtual%20Sensing%20to%20Enable%20Real-Time%20Monitoring%20of%20Inaccessible%20Locations%20%5C%5C%26amp%3Bamp%3B%20Unmeasurable%20Parameters.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.00107%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2412.00107%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Virtual%20Sensing%20to%20Enable%20Real-Time%20Monitoring%20of%20Inaccessible%20Locations%20%5C%5C%26amp%3B%20Unmeasurable%20Parameters%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kazuma%22%2C%22lastName%22%3A%22Kobayashi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Farid%22%2C%22lastName%22%3A%22Ahmed%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Syed%20Bahauddin%22%2C%22lastName%22%3A%22Alam%22%7D%5D%2C%22abstractNote%22%3A%22Real-time%20monitoring%20of%20critical%20parameters%20is%20essential%20for%20energy%20systems%27%20safe%20and%20efficient%20operation.%20However%2C%20traditional%20sensors%20often%20fail%20and%20degrade%20in%20harsh%20environments%20where%20physical%20sen
sors%20cannot%20be%20placed%20%28inaccessible%20locations%29.%20In%20addition%2C%20there%20are%20important%20parameters%20that%20cannot%20be%20directly%20measured%20by%20sensors.%20We%20need%20machine%20learning%20%28ML%29-based%20real-time%20monitoring%20in%20those%20remote%20locations%20to%20ensure%20system%20operations.%20However%2C%20traditional%20ML%20models%20struggle%20to%20process%20continuous%20sensor%20profile%20data%20to%20fit%20model%20requirements%2C%20leading%20to%20the%20loss%20of%20spatial%20relationships.%20Another%20challenge%20for%20real-time%20monitoring%20is%20%60%60dataset%20shift%5C%22%20and%20the%20need%20for%20frequent%20retraining%20under%20varying%20conditions%2C%20where%20extensive%20retraining%20prohibits%20real-time%20inference.%20To%20resolve%20these%20challenges%2C%20this%20study%20addressed%20the%20limitations%20of%20real-time%20monitoring%20methods%20by%20enabling%20monitoring%20in%20locations%20where%20physical%20sensors%20are%20impractical%20to%20deploy.%20Our%20proposed%20approach%2C%20utilizing%20Multi-Input%20Operator%20Network%20virtual%20sensors%2C%20leverages%20deep%20learning%20to%20seamlessly%20integrate%20diverse%20data%20sources%20and%20accurately%20predict%20key%20parameters%20in%20real-time%20without%20the%20need%20for%20additional%20physical%20sensors.%20The%20approach%27s%20effectiveness%20is%20demonstrated%20through%20thermal-hydraulic%20monitoring%20in%20a%20nuclear%20reactor%20subchannel%2C%20achieving%20remarkable%20accuracy.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2412.00107%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2412.00107%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222025-02-04T15%3A44%3A07Z%22%7D%7D%2C%7B%22key%22%3A%22947N8VAJ%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Prakash%20et%20al.%22%2C%22parsedDate%22%3A%222023%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EPrakash%2C%20A.%2C%20Chang%2C%20M.%2C%20Jin%2C%20M.%2C%20Tu%2C%20R.%20%26amp%3B%20Gupta%2C%20S.%203D%20Reconstruction%20of%20Objects%20in%20Hands%20without%20Real%20World%203D%20Supervision.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2305.03036%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2305.03036%3C%5C%2Fa%3E%20%282023%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%223D%20Reconstruction%20of%20Objects%20in%20Hands%20without%20Real%20World%203D%20Supervision%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Aditya%22%2C%22lastName%22%3A%22Prakash%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matthew%22%2C%22lastName%22%3A%22Chang%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Matthew%22%2C%22
lastName%22%3A%22Jin%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ruisen%22%2C%22lastName%22%3A%22Tu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Saurabh%22%2C%22lastName%22%3A%22Gupta%22%7D%5D%2C%22abstractNote%22%3A%22Prior%20works%20for%20reconstructing%20hand-held%20objects%20from%20a%20single%20image%20train%20models%20on%20images%20paired%20with%203D%20shapes.%20Such%20data%20is%20challenging%20to%20gather%20in%20the%20real%20world%20at%20scale.%20Consequently%2C%20these%20approaches%20do%20not%20generalize%20well%20when%20presented%20with%20novel%20objects%20in%20in-the-wild%20settings.%20While%203D%20supervision%20is%20a%20major%20bottleneck%2C%20there%20is%20an%20abundance%20of%20a%29%20in-the-wild%20raw%20video%20data%20showing%20hand-object%20interactions%20and%20b%29%20synthetic%203D%20shape%20collections.%20In%20this%20paper%2C%20we%20propose%20modules%20to%20leverage%203D%20supervision%20from%20these%20sources%20to%20scale%20up%20the%20learning%20of%20models%20for%20reconstructing%20hand-held%20objects.%20Specifically%2C%20we%20extract%20multiview%202D%20mask%20supervision%20from%20videos%20and%203D%20shape%20priors%20from%20shape%20collections.%20We%20use%20these%20indirect%203D%20cues%20to%20train%20occupancy%20networks%20that%20predict%20the%203D%20shape%20of%20objects%20from%20a%20single%20RGB%20image.%20Our%20experiments%20in%20the%20challenging%20object%20generalization%20setting%20on%20in-the-wild%20MOW%20dataset%20show%2011.6%25%20relative%20improvement%20over%20models%20trained%20with%203D%20supervision%20on%20existing%20datasets.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222023%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2305.03036%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2305.03036%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222024-12-04T19%3A46%3A46Z%22%7D%7D%2C%7B%22key%22%3A%22TFTZC75S%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Abbasi%20and%20Mehdizadeh%22%2C%22parsedDate%22%3A%222024-11-01%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EAbbasi%2C%20S.%20%26amp%3B%20Mehdizadeh%2C%20A.%20On%20the%20interplay%20between%20fluid%20flow%20characteristics%20and%20small%20particle%20deposition%20in%20turbulent%20wall%20bounded%20flows.%20%3Ci%3EPhysics%20of%20Fluids%3C%5C%2Fi%3E%20%3Cb%3E36%3C%5C%2Fb%3E%2C%20113305%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22On%20the%20interplay%20between%20fluid%20flow%20characteristics%20and%20small%20particle%20deposition%20in%20turbulent%20wall%20bounded%20flows%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Sanaz%22%2C%22lastName%22%3A%22Abbasi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Amirfarhang%22%2C%22lastName%22%3A%22Mehdizadeh%22%7D%5
D%2C%22abstractNote%22%3A%22This%20study%20investigates%20the%20transport%20and%20deposition%20of%20small%20particles%20%28500%5Cu2009nm%5Cu2264dp%5Cu226410%5Cu2009%5Cu03bcm%29%20in%20a%20fully%20developed%20turbulent%20channel%20flow%2C%20focusing%20on%20two%20fluid%20friction%20Reynolds%20numbers%3A%20Re%5Cu03c4%3D180%20and%20Re%5Cu03c4%3D1000.%20Using%20the%20point%20particle%5Cu2013direct%20numerical%20simulation%20method%20under%20the%20assumption%20of%20one-way%20coupling%2C%20we%20study%20how%20fluid%20flow%20%28carrier%20phase%29%20characteristics%20influence%20particle%20deposition.%20Our%20findings%20suggest%20that%20changes%20in%20flow%20conditions%20can%20significantly%20alter%20the%20deposition%20behavior%20of%20particles%20with%20the%20same%20size%20and%20properties.%20Furthermore%2C%20we%20show%20for%20the%20first%20time%20that%20gravity%20has%20minimal%20impact%20on%20deposition%20dynamics%20only%20at%20high%20Reynolds%20numbers.%20This%20research%20enhances%20our%20understanding%20of%20small%20particle%20deposition%20and%20transport%20in%20turbulent%20flows%20at%20high%20Reynolds%20numbers%2C%20which%20is%20crucial%20for%20various%20industrial%20applications.%22%2C%22date%22%3A%222024-11-01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1063%5C%2F5.0232440%22%2C%22ISSN%22%3A%221070-6631%2C%201089-7666%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fpubs.aip.org%5C%2Fpof%5C%2Farticle%5C%2F36%5C%2F11%5C%2F113305%5C%2F3318585%5C%2FOn-the-interplay-between-fluid-flow%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222024-12-04T18%3A33%3A29Z%22%7D%7D%2C%7B%22key%22%3A%22AHHP4SRK%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22You%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EYou%2C%20D.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20Inverse%20design%20of%20short-range%20order%20arrangement%20via%20neural%20network.%20%3Ci%3EInternational%20Journal%20of%20Solids%20and%20Structures%3C%5C%2Fi%3E%20113175%20%282024%29%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27http%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.ijsolstr.2024.113175%27%3Ehttp%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.ijsolstr.2024.113175%3C%5C%2Fa%3E.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Inverse%20design%20of%20short-range%20order%20arrangement%20via%20neural%20network%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Daegun%22%2C%22lastName%22%3A%22You%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Orcun%20Koray%22%2C%22lastName%22%3A%22Celebi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Diab%20W.%22%2C%22lastName%22%3A%22Abueidda%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gorkem%22%2C%22lastName%22%3A%22Gengor%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ahmed%20Sameer%22%2C%22lastName%22%3A%22Khan%20Mohammed%22%7D%2C%7B%22creatorT
ype%22%3A%22author%22%2C%22firstName%22%3A%22Seid%22%2C%22lastName%22%3A%22Koric%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Huseyin%22%2C%22lastName%22%3A%22Sehitoglu%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%2211%5C%2F2024%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.ijsolstr.2024.113175%22%2C%22ISSN%22%3A%2200207683%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Flinkinghub.elsevier.com%5C%2Fretrieve%5C%2Fpii%5C%2FS0020768324005341%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222024-12-02T17%3A40%3A02Z%22%7D%7D%2C%7B%22key%22%3A%22532FEQLT%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Sharma%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3ESharma%2C%20A.%2C%20Ding%2C%20H.%2C%20Li%2C%20J.%2C%20Dani%2C%20N.%20%26amp%3B%20Zhang%2C%20M.%20MiniKV%3A%20Pushing%20the%20Limits%20of%20LLM%20Inference%20via%202-Bit%20Layer-Discriminative%20KV%20Cache.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2411.18077%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2411.18077%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22MiniKV%3A%20Pushing%20the%20Limits%20of%20LLM%20Inference%20via%202-Bit%20Layer-Discriminative%20KV%20Cache%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Akshat%22%2C%22lastName%22%3A%22Sharma%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Hangliang%22%2C%22lastName%22%3A%22Ding%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jianping%22%2C%22lastName%22%3A%22Li%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Neel%22%2C%22lastName%22%3A%22Dani%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Minjia%22%2C%22lastName%22%3A%22Zhang%22%7D%5D%2C%22abstractNote%22%3A%22How%20to%20efficiently%20serve%20LLMs%20in%20practice%20has%20become%20exceptionally%20challenging%20due%20to%20their%20prohibitive%20memory%20and%20computation%20requirements.%20In%20this%20study%2C%20we%20investigate%20optimizing%20the%20KV%20cache%2C%20whose%20memory%20footprint%20poses%20a%20critical%20bottleneck%20in%20LLM%20inference%2C%20especially%20when%20dealing%20with%20long%20context%20tasks.%20To%20tackle%20the%20challenge%2C%20we%20introduce%20MiniKV%2C%20a%20KV%20cache%20optimization%20method%20that%20simultaneously%20preserves%20long%20context%20task%20accuracy%20while%20significantly%20reducing%20KV%20cache%20size%20via%20a%20novel%202-bit%20layer-discriminative%20KV%20cache.%20More%20importantly%2C%20we%20develop%20specialized%20CUDA%20kernels%20to%20make%20MiniKV%20compatible%20with%20FlashAttention.%20Experiments%20on%20a%20wide%20range%20of%20long%20context%20tasks%20show%20that%20MiniKV%20effectively%20achieves%2086%25%20KV%20cache%20compression%20ratio%20while%20recovering%20ove
Rao, R., Chandrasekar, K. & Kale, L. An Adaptive Asynchronous Approach for the Single-Source Shortest Paths Problem. in IA^3 2024 - 14th Workshop on Irregular Applications: Architectures & Algorithms (Atlanta, Georgia, 2024). https://doi.org/10.1109/SCW63240.2024.00097.
Abedsoltan, A., Ma, S., Pandit, P. & Belkin, M. Fast training of large kernel models with delayed projections. Preprint at https://doi.org/10.48550/ARXIV.2411.16658 (2024).
Chau, T. N., Wang, X., McDowell, J. M. & Li, S. Advancing plant single-cell genomics with foundation models. Current Opinion in Plant Biology 82, 102666 (2024). https://doi.org/10.1016/j.pbi.2024.102666.
Kim, Y.-J., Waegel, A., Hakkarainen, M., Yi, Y. K. & Braham, W. Understanding HVAC system runtime of U.S. homes: An energy signature analysis using smart thermostat data. Build. Simul. (2024). https://doi.org/10.1007/s12273-024-1203-9.
Dhruv, V., Prather, B., Wong, G. & Gammie, C. F. A Survey of General Relativistic Magnetohydrodynamic Models for Black Hole Accretion Systems. Preprint at https://doi.org/10.48550/ARXIV.2411.12647 (2024).
Kasirajan, V., Battelle, T. & Wold, B. Empowering Large Scale Quantum Circuit Development: Effective Simulation of Sycamore Circuits. Preprint at https://doi.org/10.48550/ARXIV.2411.12131 (2024).
Griebel, S. et al. Locating the Leading Edge of cultural Change. in CHR 2024: Computational Humanities Research Conference (Aarhus, Denmark, 2024). https://ceur-ws.org/Vol-3834/paper70.pdf.
Liu, Z. et al. Accurate Ring Strain Energy Predictions with Machine Learning and Application in Strain-Promoted Reactions. Preprint at https://doi.org/10.26434/chemrxiv-2024-dtq6q (2024).
Schmaltz, T., Hu, Y. & Lazarian, A. Estimate Sonic Mach Number in the Interstellar Medium with Convolutional Neural Network. Preprint at http://arxiv.org/abs/2411.11157 (2024).
Roze, L. V. et al. Increasing thermostability of the key photorespiratory enzyme glycerate 3-kinase by structure-based recombination. Plant Biotechnology Journal pbi.14508 (2024). https://doi.org/10.1111/pbi.14508.
Liu, H. et al. Time-MMD: Multi-Domain Multimodal Dataset for Time Series Analysis. Preprint at https://doi.org/10.48550/ARXIV.2406.08627 (2024).
Liu, H., Liu, C. & Prakash, B. A. A Picture is Worth A Thousand Numbers: Enabling LLMs Reason about Time Series via Visualization. Preprint at https://doi.org/10.48550/ARXIV.2411.06018 (2024).
Yang, B. et al. Engineering the Mechanical Stability of a Therapeutic Complex between Affibody and Programmed Death-Ligand 1 by Anchor Point Selection. ACS Nano 18, 31912–31922 (2024). https://doi.org/10.1021/acsnano.4c09220.
Nguyen, T.-D., Zhang, C., Gitbumrungsin, M., Raheja, A. & Chen, T. Remote Kinematic Analysis for Mobility Scooter Riders Leveraging Edge AI. AAAI-SS 4, 314–318 (2024). https://doi.org/10.1609/aaaiss.v4i1.31808.
Chandrasekar, K. & Kale, L. Shared Memory-Aware Latency-Sensitive Message Aggregation for Fine-Grained Communication. Preprint at https://doi.org/10.48550/ARXIV.2411.03533 (2024).
Li, X., Dai, Y. & Qu, Q. Understanding Generalizability of Diffusion Models Requires Rethinking the Hidden Gaussian Structure. Preprint at https://doi.org/10.48550/ARXIV.2410.24060 (2024).
Kazemi, A., Fatima, Q. ul ain, Kindratenko, V. & Tessum, C. AIDOVECL: AI-generated Dataset of Outpainted Vehicles for Eye-level Classification and Localization. Preprint at https://doi.org/10.48550/ARXIV.2410.24116 (2024).
Liu, Q. et al. Univariate Conditional Variational Autoencoder for Morphogenic Patterns Design in Frontal Polymerization-Based Manufacturing. Preprint at https://doi.org/10.48550/ARXIV.2410.17518 (2024).
lace%20during%20frontal%20polymerization%20%28FP%29%20destabilizes%20the%20planar%20mode%20of%20front%20propagation%2C%20leading%20to%20spatially%20varying%2C%20complex%20hierarchical%20patterns%20in%20thermoset%20polymeric%20materials.%20Although%20modern%20reaction-diffusion%20models%20can%20predict%20the%20patterns%20resulting%20from%20unstable%20FP%2C%20the%20inverse%20design%20of%20patterns%2C%20which%20aims%20to%20retrieve%20process%20conditions%20that%20produce%20a%20desired%20pattern%2C%20remains%20an%20open%20challenge%20due%20to%20the%20non-unique%20and%20non-intuitive%20mapping%20between%20process%20conditions%20and%20manufactured%20patterns.%20In%20this%20work%2C%20we%20propose%20a%20probabilistic%20generative%20model%20named%20univariate%20conditional%20variational%20autoencoder%20%28UcVAE%29%20for%20the%20inverse%20design%20of%20hierarchical%20patterns%20in%20FP-based%20manufacturing.%20Unlike%20the%20cVAE%2C%20which%20encodes%20both%20the%20design%20space%20and%20the%20design%20target%2C%20the%20UcVAE%20encodes%20only%20the%20design%20space.%20In%20the%20encoder%20of%20the%20UcVAE%2C%20the%20number%20of%20training%20parameters%20is%20significantly%20reduced%20compared%20to%20the%20cVAE%2C%20resulting%20in%20a%20shorter%20training%20time%20while%20maintaining%20comparable%20performance.%20Given%20desired%20pattern%20images%2C%20the%20trained%20UcVAE%20can%20generate%20multiple%20process%20condition%20solutions%20that%20produce%20high-fidelity%20hierarchical%20patterns.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2410.17518%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2410.17518%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222024-11-21T17%3A18%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22AI2YSVUV%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Hossain%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%20text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EHossain%2C%20R.%20B.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20Virtual%20Sensing%20for%20Real-Time%20Degradation%20Monitoring%20of%20Nuclear%20Systems%3A%20Leveraging%20DeepONet%20for%20Enhanced%20Sensing%20Coverage%20for%20Digital%20Twin-Enabling%20Technology.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2410.13762%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2410.13762%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Virtual%20Sensing%20for%20Real-Time%20Degradation%20Monitoring%20of%20Nuclear%20Systems%3A%20Leveraging%20DeepONet%20for%20Enhanced%20Sensing%20Coverage%20for%20Digital%20Twin-Enabling%20Technology%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Raisa%20Bentay%22%2C%22lastName%22%3A%22Hossain%22%7D%2C%
7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Farid%22%2C%22lastName%22%3A%22Ahmed%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kazuma%22%2C%22lastName%22%3A%22Kobayashi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Seid%22%2C%22lastName%22%3A%22Koric%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Diab%22%2C%22lastName%22%3A%22Abueidda%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Syed%20Bahauddin%22%2C%22lastName%22%3A%22Alam%22%7D%5D%2C%22abstractNote%22%3A%22Effective%20real-time%20monitoring%20technique%20is%20crucial%20for%20detecting%20material%20degradation%20and%20maintaining%20the%20structural%20integrity%20of%20nuclear%20systems%20to%20ensure%20both%20safety%20and%20operational%20efficiency.%20Traditional%20physical%20sensor%20systems%20face%20limitations%20such%20as%20installation%20challenges%2C%20high%20costs%2C%20and%20difficulties%20in%20measuring%20critical%20parameters%20in%20hard-to-reach%20or%20harsh%20environments%2C%20often%20resulting%20in%20incomplete%20data%20coverage.%20Machine%20learning-driven%20virtual%20sensors%20offer%20a%20promising%20solution%20by%20enhancing%20physical%20sensor%20capabilities%20to%20monitor%20critical%20degradation%20indicators%20like%20pressure%2C%20velocity%2C%20and%20turbulence.%20However%2C%20conventional%20machine%20learning%20models%20struggle%20with%20real-time%20monitoring%20due%20to%20the%20high-dimensional%20nature%20of%20reactor%20data%20and%20the%20need%20for%20frequent%20retraining.%20This%20paper%20explores%20the%20use%20of%20Deep%20Operator%20Networks%20%28DeepONet%29%20within%20a%20digital%20twin%20%28DT%29%20framework%20to%20predict%20key%20thermal-hydraulic%20parameters%20in%20the%20hot%20leg%20of%20an%20AP-1000%20Pressurized%20Water%20Reactor%20%28PWR%29.%20In%20this%20study%2C%20DeepONet%20is%20trained%20with%20different%20operational%20conditions%2C%20which%20relaxes%20the%20requirement%20of%20continuous%20retraining%2C%20making%20it%20suitable%20for%20online%20and%20real-time%20prediction%20components%20for%20DT.%20Our%20results%20show%20that%20DeepONet%20achieves%20accurate%20predictions%20with%20low%20mean%20squared%20error%20and%20relative%20L2%20error%20and%20can%20make%20predictions%20on%20unknown%20data%20160%2C000%20times%20faster%20than%20traditional%20finite%20element%20%28FE%29%20simulations.%20This%20speed%20and%20accuracy%20make%20DeepONet%20a%20powerful%20tool%20for%20tracking%20conditions%20that%20contribute%20to%20material%20degradation%20in%20real-time%2C%20enhancing%20reactor%20safety%20and%20longevity.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2410.13762%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2410.13762%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222024-11-21T17%3A11%3A06Z%22%7D%7D%2C%7B%22key%22%3A%22ZUIDJ8D4%22%2C%22library%22%3A%7B%22id%22%3A5005740%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Chen%20et%20al.%22%2C%22parsedDate%22%3A%222024%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%20style%3D%5C%22clear%3A%20left%3B%20%5C%22%3E%5Cn%20%20%20%20%3Cdiv%20class%3D%5C%22csl-left-margin%5C%22%20style%3D%5C%22float%3A%20left%3B%20padding-right%3A%200.5em%3B%2
0text-align%3A%20right%3B%20width%3A%201em%3B%5C%22%3E1.%3C%5C%2Fdiv%3E%3Cdiv%20class%3D%5C%22csl-right-inline%5C%22%20style%3D%5C%22margin%3A%200%20.4em%200%201.5em%3B%5C%22%3EChen%2C%20Z.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20Retrospective%20Learning%20from%20Interactions.%20Preprint%20at%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2410.13852%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.48550%5C%2FARXIV.2410.13852%3C%5C%2Fa%3E%20%282024%29.%3C%5C%2Fdiv%3E%5Cn%20%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22preprint%22%2C%22title%22%3A%22Retrospective%20Learning%20from%20Interactions%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Zizhao%22%2C%22lastName%22%3A%22Chen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Mustafa%20Omer%22%2C%22lastName%22%3A%22Gul%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yiwei%22%2C%22lastName%22%3A%22Chen%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gloria%22%2C%22lastName%22%3A%22Geng%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anne%22%2C%22lastName%22%3A%22Wu%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Yoav%22%2C%22lastName%22%3A%22Artzi%22%7D%5D%2C%22abstractNote%22%3A%22Multi-turn%20interactions%20between%20large%20language%20models%20%28LLMs%29%20and%20users%20naturally%20include%20implicit%20feedback%20signals.%20If%20an%20LLM%20responds%20in%20an%20unexpected%20way%20to%20an%20instruction%2C%20the%20user%20is%20likely%20to%20signal%20it%20by%20rephrasing%20the%20request%2C%20expressing%20frustration%2C%20or%20pivoting%20to%20an%20alternative%20task.%20Such%20signals%20are%20task-independent%20and%20occupy%20a%20relatively%20constrained%20subspace%20of%20language%2C%20allowing%20the%20LLM%20to%20identify%20them%20even%20if%20it%20fails%20on%20the%20actual%20task.%20This%20creates%20an%20avenue%20for%20continually%20learning%20from%20interactions%20without%20additional%20annotations.%20We%20introduce%20ReSpect%2C%20a%20method%20to%20learn%20from%20such%20signals%20in%20past%20interactions%20via%20retrospection.%20We%20deploy%20ReSpect%20in%20a%20new%20multimodal%20interaction%20scenario%2C%20where%20humans%20instruct%20an%20LLM%20to%20solve%20an%20abstract%20reasoning%20task%20with%20a%20combinatorial%20solution%20space.%20Through%20thousands%20of%20interactions%20with%20humans%2C%20we%20show%20how%20ReSpect%20gradually%20improves%20task%20completion%20rate%20from%2031%25%20to%2082%25%2C%20all%20without%20any%20external%20annotation.%22%2C%22genre%22%3A%22%22%2C%22repository%22%3A%22arXiv%22%2C%22archiveID%22%3A%22%22%2C%22date%22%3A%222024%22%2C%22DOI%22%3A%2210.48550%5C%2FARXIV.2410.13852%22%2C%22citationKey%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F2410.13852%22%2C%22language%22%3A%22%22%2C%22collections%22%3A%5B%22XHSH9DGT%22%5D%2C%22dateModified%22%3A%222024-11-21T17%3A09%3A40Z%22%7D%7D%5D%7D
1.
Richards, C., Dima, A., Ferguson, D. & Witek, H. Growing black-hole hair in nonminimally coupled biscalar gravity. Preprint at https://doi.org/10.48550/ARXIV.2501.14034 (2025).
2.
Osorio, J. et al. Keep it Local: Comparing Domain-Specific LLMs in Native and Machine Translated Text using Parallel Corpora on Political Conflict. in 2024 2nd International Conference on Foundation and Large Language Models (FLLM) 542–552 (IEEE, Dubai, United Arab Emirates, 2024). http://doi.org/10.1109/FLLM63129.2024.10852489.
3.
Avdiunina, P., Jamal, S., Gusev, F. & Isayev, O. All that glitters is not gold: Importance of rigorous evaluation of proteochemometric models. Preprint at https://doi.org/10.26434/chemrxiv-2025-vbmgc (2025).
4.
Pilny, A., Bonito, J. & Schecter, A. Coding Small Group Communication with AI: RNNs and Transformers with Context. Small Group Research 10464964251314196 (2025) http://doi.org/10.1177/10464964251314197.
5.
Deng, J. et al. dattri: A Library for Efficient Data Attribution. Preprint at https://doi.org/10.48550/ARXIV.2410.04555 (2024).
6.
Chen, W., Yan, B., Chen, C.-C. & Watanabe, S. Floras 50: A Massively Multilingual Multitask Benchmark for Long-Form Conversational Speech. in 2024 IEEE Spoken Language Technology Workshop (SLT) 891–898 (IEEE, Macao, 2024). http://doi.org/10.1109/SLT61566.2024.10832167.
7.
Nakamura, T. et al. Discrete Speech Unit Extraction via Independent Component Analysis. Preprint at https://doi.org/10.48550/ARXIV.2501.06562 (2025).
8.
Khot, A., Wang, X., Roy, A., Kindratenko, V. & Neubauer, M. S. Evidential Deep Learning for Uncertainty Quantification and Out-of-Distribution Detection in Jet Identification using Deep Neural Networks. Preprint at https://doi.org/10.48550/ARXIV.2501.05656 (2025).
9.
Andrews, J., Weirich, K. & Schiller, U. D. Molecular-Scale Simulation of Wetting of Actin Filaments by Protein Droplets. J. Phys. Chem. B 129, 1109–1121 (2025).
10.
Wang, S. et al. Deep CNN-based semi-supervised learning approach for identifying and segmenting corrosion in hydraulic steel and water resources infrastructure. Structural Health Monitoring 14759217241305040 (2025) http://doi.org/10.1177/14759217241305039.
11.
Feng, J. T., Satheesan, S. P., Kong, S., Donders, T. H. & Punyasena, S. W. Addressing the open world: detecting and segmenting pollen on palynological slides with deep learning. Preprint at https://doi.org/10.1101/2025.01.05.631390 (2025).
12.
Vatansever, D. & Levin, D. Collisionless Plasma Plume Expansion Under External Magnetic Fields. (2025).
13.
Wu, Y. et al. Enhancing Audiovisual Speech Recognition through Bifocal Preference Optimization. Preprint at https://doi.org/10.48550/ARXIV.2412.19005 (2024).
14.
Imam, I. A. et al. Integrating Protein Language Model and Molecular Dynamics Simulations to Discover Antibiofouling Peptides. Langmuir 41, 811–821 (2025).
15.
Kobayashi, K. & Alam, S. B. Physics-regularized neural networks for predictive modeling of silicon carbide swelling with limited experimental data. Sci Rep 14, 30666 (2024).
16.
Hassan, U., Zhu, J., Chen, D. & Cheung, S.-C. S. DPGEM: Differentially Private Generative Model with Exponential Mechanism. in 2024 IEEE International Workshop on Information Forensics and Security (WIFS) 1–6 (IEEE, Rome, Italy, 2024). http://doi.org/10.1109/WIFS61860.2024.10810705.
17.
Padmanabha, G. A., Safta, C., Bouklas, N. & Jones, R. E. Condensed Stein Variational Gradient Descent for Uncertainty Quantification of Neural Networks. Preprint at https://doi.org/10.48550/ARXIV.2412.16462 (2024).
18.
Brandt, P. T. et al. ConfliBERT: A Language Model for Political Conflict. Preprint at https://doi.org/10.48550/ARXIV.2412.15060 (2024).
19.
Modesitt, E., Yang, K., Hulsey, S., Zhai, C. & Kindratenko, V. ORBIT: Cost-Effective Dataset Curation for Large Language Model Domain Adaptation with an Astronomy Case Study. Preprint at https://doi.org/10.48550/ARXIV.2412.14436 (2024).
20.
Xu, Z., Yan, J., Gupta, A. & Srikumar, V. State Space Models are Strong Text Rerankers. Preprint at https://doi.org/10.48550/ARXIV.2412.14354 (2024).
21.
Fukami, K. & Taira, K. Single-snapshot machine learning for super-resolution of turbulence. Preprint at https://doi.org/10.48550/ARXIV.2409.04923 (2024).
22.
Kacmaz, S., Haas, R. & Huerta, E. A. Machine learning-driven conservative-to-primitive conversion in hybrid piecewise polytropic and tabulated equations of state. Preprint at https://doi.org/10.48550/ARXIV.2412.07836 (2024).
23.
Mark, M. S. et al. Policy Agnostic RL: Offline RL and Online RL Fine-Tuning of Any Class and Backbone. Preprint at https://doi.org/10.48550/ARXIV.2412.06685 (2024).
24.
Kim, Y., Most, E. R., Beloborodov, A. M. & Ripperda, B. Black hole pulsars and monster shocks as outcomes of black hole-neutron star mergers. Preprint at https://doi.org/10.48550/ARXIV.2412.05760 (2024).
25.
Chen, P. et al. Learning a Filtered Backprojection Reconstruction Method for Photoacoustic Computed Tomography with Hemispherical Measurement Geometries. Preprint at https://doi.org/10.48550/ARXIV.2412.01971 (2024).
26.
Kobayashi, K., Ahmed, F. & Alam, S. B. Virtual Sensing to Enable Real-Time Monitoring of Inaccessible Locations & Unmeasurable Parameters. Preprint at https://doi.org/10.48550/ARXIV.2412.00107 (2024).
27.
Prakash, A., Chang, M., Jin, M., Tu, R. & Gupta, S. 3D Reconstruction of Objects in Hands without Real World 3D Supervision. Preprint at https://doi.org/10.48550/ARXIV.2305.03036 (2023).
28.
Abbasi, S. & Mehdizadeh, A. On the interplay between fluid flow characteristics and small particle deposition in turbulent wall bounded flows. Physics of Fluids 36, 113305 (2024).
29.
You, D. et al. Inverse design of short-range order arrangement via neural network. International Journal of Solids and Structures 113175 (2024) http://doi.org/10.1016/j.ijsolstr.2024.113175.
30.
Sharma, A., Ding, H., Li, J., Dani, N. & Zhang, M. MiniKV: Pushing the Limits of LLM Inference via 2-Bit Layer-Discriminative KV Cache. Preprint at https://doi.org/10.48550/ARXIV.2411.18077 (2024).
31.
Rao, R., Chandrasekar, K. & Kale, L. An Adaptive Asynchronous Approach for the Single-Source Shortest Paths Problem. in IA^3 2024 - 14th Workshop on Irregular Applications: Architectures & Algorithms (Atlanta, Georgia, 2024). http://doi.org/10.1109/SCW63240.2024.00097.
32.
Abedsoltan, A., Ma, S., Pandit, P. & Belkin, M. Fast training of large kernel models with delayed projections. Preprint at https://doi.org/10.48550/ARXIV.2411.16658 (2024).
33.
Chau, T. N., Wang, X., McDowell, J. M. & Li, S. Advancing plant single-cell genomics with foundation models. Current Opinion in Plant Biology 82, 102666 (2024).
34.
Kim, Y.-J., Waegel, A., Hakkarainen, M., Yi, Y. K. & Braham, W. Understanding HVAC system runtime of U.S. homes: An energy signature analysis using smart thermostat data. Build. Simul. (2024) http://doi.org/10.1007/s12273-024-1203-9.
35.
Dhruv, V., Prather, B., Wong, G. & Gammie, C. F. A Survey of General Relativistic Magnetohydrodynamic Models for Black Hole Accretion Systems. Preprint at https://doi.org/10.48550/ARXIV.2411.12647 (2024).
36.
Kasirajan, V., Battelle, T. & Wold, B. Empowering Large Scale Quantum Circuit Development: Effective Simulation of Sycamore Circuits. Preprint at https://doi.org/10.48550/ARXIV.2411.12131 (2024).
37.
Griebel, S. et al. Locating the Leading Edge of Cultural Change. in (Aarhus, Denmark, 2024).
38.
Liu, Z. et al. Accurate Ring Strain Energy Predictions with Machine Learning and Application in Strain-Promoted Reactions. Preprint at https://doi.org/10.26434/chemrxiv-2024-dtq6q (2024).
39.
Schmaltz, T., Hu, Y. & Lazarian, A. Estimate Sonic Mach Number in the Interstellar Medium with Convolutional Neural Network. Preprint at https://doi.org/10.48550/ARXIV.2411.11157 (2024).
40.
Roze, L. V. et al. Increasing thermostability of the key photorespiratory enzyme glycerate 3‐kinase by structure‐based recombination. Plant Biotechnology Journal pbi.14508 (2024) http://doi.org/10.1111/pbi.14508.
41.
Liu, H. et al. Time-MMD: Multi-Domain Multimodal Dataset for Time Series Analysis. Preprint at https://doi.org/10.48550/ARXIV.2406.08627 (2024).
42.
Liu, H., Liu, C. & Prakash, B. A. A Picture is Worth A Thousand Numbers: Enabling LLMs Reason about Time Series via Visualization. Preprint at https://doi.org/10.48550/ARXIV.2411.06018 (2024).
43.
Yang, B. et al. Engineering the Mechanical Stability of a Therapeutic Complex between Affibody and Programmed Death-Ligand 1 by Anchor Point Selection. ACS Nano 18, 31912–31922 (2024).
44.
Nguyen, T.-D., Zhang, C., Gitbumrungsin, M., Raheja, A. & Chen, T. Remote Kinematic Analysis for Mobility Scooter Riders Leveraging Edge AI. AAAI-SS 4, 314–318 (2024).
45.
Chandrasekar, K. & Kale, L. Shared Memory-Aware Latency-Sensitive Message Aggregation for Fine-Grained Communication. Preprint at https://doi.org/10.48550/ARXIV.2411.03533 (2024).
46.
Li, X., Dai, Y. & Qu, Q. Understanding Generalizability of Diffusion Models Requires Rethinking the Hidden Gaussian Structure. Preprint at https://doi.org/10.48550/ARXIV.2410.24060 (2024).
47.
Kazemi, A., Fatima, Q. ul ain, Kindratenko, V. & Tessum, C. AIDOVECL: AI-generated Dataset of Outpainted Vehicles for Eye-level Classification and Localization. Preprint at https://doi.org/10.48550/ARXIV.2410.24116 (2024).
48.
Liu, Q. et al. Univariate Conditional Variational Autoencoder for Morphogenic Patterns Design in Frontal Polymerization-Based Manufacturing. Preprint at https://doi.org/10.48550/ARXIV.2410.17518 (2024).
49.
Hossain, R. B. et al. Virtual Sensing for Real-Time Degradation Monitoring of Nuclear Systems: Leveraging DeepONet for Enhanced Sensing Coverage for Digital Twin-Enabling Technology. Preprint at https://doi.org/10.48550/ARXIV.2410.13762 (2024).
50.
Chen, Z. et al. Retrospective Learning from Interactions. Preprint at https://doi.org/10.48550/ARXIV.2410.13852 (2024).