How should we build methods and tools to prepare for future pandemics?

There are several models we could follow to develop methods and tools to prepare for future pandemics. I've made a short poll/conversation to discuss what these are. Previous projects I have worked on with similar aims have failed due to issues with community and resources rather than any technical hurdles.

I see the epinowcast community as one way of going about this, and I think it is worth considering how it should be structured to make it a success.

Also very keen to have a conversation here about what people think (this thread may end up being a better platform than the poll, as it's quite a complex topic).

The poll is here:


One question that I can't seem to formulate correctly is the decision theory component and how/when it meshes with the modelling. Maybe it's a derivative of adequately communicating results with appropriate uncertainty, and needs its own framework. There's an apocryphal saying attributed to Lyndon Johnson: "ranges are for cattle; give me a number". At some points I was presenting the proportions of forecasts beyond certain breakpoints that were tied to specific actions. Maybe this idea isn't complete, but the intersection of the modelling and the action needs improvement (and is semi-related to your 80% models)…

First post! :partying_face:

Yes, I think this is a really good point. We were actually just having this discussion in a group meeting looking back at our pandemic work. We struggled to even frame the question of how to connect decisions with these kinds of methods.

For others, I think the 80% piece you're talking about is this one: Sam Abbott: What is 80% good enough for real-time infectious disease analyses?

For interest, I also just wrote a bit of framing around this: Sam Abbott: How should we build methods and tools to prepare for future pandemics?

Lastly, for people interested in the results (they are so fun!), they are here:
