A seminar by Professor Dacheng Xiu from the University of Chicago
Title: Expected Returns and Large Language Models
Abstract: We extract contextualized representations of news text to predict returns, using state-of-the-art large language models from natural language processing. Unlike traditional word-based methods such as bag-of-words or word vectors, contextualized representations capture both the syntax and the semantics of text, providing a more comprehensive understanding of its meaning. Notably, word-based approaches are more susceptible to errors when negation words appear in news articles. Our study covers 16 international equity markets and news articles in 13 languages, providing polyglot evidence of news-induced return predictability. We find that information in newswires is incorporated into prices with an inefficient delay consistent with limits to arbitrage, yet one that can still be exploited in real-time trading strategies. Additionally, a trading strategy that capitalizes on fresh news alerts achieves even higher Sharpe ratios.
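The contrast between word-based and contextualized representations can be illustrated with a minimal sketch. The snippet below is not the authors' pipeline; the model name, the two example headlines, and the mean-pooling step are illustrative assumptions. It builds a bag-of-words vector and a transformer-based contextualized embedding for a headline and its negated counterpart: the bag-of-words vectors differ only by one extra count for "not", whereas every token vector in the contextualized embedding depends on the surrounding words, so the negation can be reflected in the representation fed to a return-prediction model.

import numpy as np
import torch
from sklearn.feature_extraction.text import CountVectorizer
from transformers import AutoModel, AutoTokenizer

headlines = [
    "The firm beat earnings expectations this quarter.",          # positive news
    "The firm did not beat earnings expectations this quarter.",  # negated meaning
]

# Bag-of-words: the two count vectors differ only in the single token "not".
bow = CountVectorizer().fit_transform(headlines).toarray().astype(float)

# Contextualized representation: mean-pool the final hidden states of a
# pretrained transformer (an illustrative small BERT-style model).
tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")
with torch.no_grad():
    enc = tok(headlines, padding=True, return_tensors="pt")
    hidden = model(**enc).last_hidden_state        # shape: (2, seq_len, dim)
    mask = enc["attention_mask"].unsqueeze(-1)     # ignore padding tokens
    ctx = (hidden * mask).sum(1) / mask.sum(1)     # mean pooling over tokens
ctx = ctx.numpy()

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print("bag-of-words similarity:   ", round(cosine(bow[0], bow[1]), 3))
print("contextualized similarity: ", round(cosine(ctx[0], ctx[1]), 3))

In a full application, embeddings like ctx would be computed for each news article and used as features in a cross-sectional return-prediction model; the sketch only shows how the two text representations are constructed.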
For further information, please contact RSFAS Seminars.