Discussion about this post

Doug Edmonds:
Given that current LLMs are trained on an internet-sized volume of text, the same would possibly be true for a "transformer-timeseries" model, echoing your point about having sufficient recessionary events to model.

Chris Napoli:

Informative read, Alex. Keep up the great work.
