SSI epinowcast governance application

@sbfnk and I have written a letter of intent for the Software Sustainability Institute's open source software call (Research Software Maintenance Fund | Software Sustainability Institute)

Provisional title: Unifying, harmonising, and establishing sustainable governance for the epinowcast/epiforecasts ecosystem of infectious disease modelling tools

The idea, as maybe implied by the title, is to link up the work we have been doing in a new organisation with more formal governance, build new tooling like shared preprocessing, and start working towards more formal onboarding and expansion. The application is here: Review application · Issue #11 · epiforecasts/ssi-2025-grant · GitHub, and any thoughts are welcome here or in the repository. The deadline for submission is the 30th, but we will submit a few days before that. This is just the first round, so there will be more feedback opportunities if we get through.

This is all connected to this Rebranding the epinowcast community and organisation - #6 by kcharniga2

I am also very interested if anyone has any other ideas for funding these efforts and would be keen to lead initiatives!

Also, apologies for our insane team composition statement. Sadly, grant writing feels like formalised boasting.


Last chance for input, folks: this goes in tomorrow.

This got judged as in scope by SSI so I now need to submit a full application by mid September. Let me know if anyone has any thoughts.


The full version of this is due on Thursday - my current draft is here: ssi-2025-grant/full-application/application.md at b4998acb6220ae5a02ddebb11e2ec150952c901b · epiforecasts/ssi-2025-grant · GitHub

Feedback welcome!

The last round scored 100%, so I am trying to stick relatively close in tone and content to that (though it's not clear to me whether the scoring metrics will change, given this round requires a very different format).


This got turned down, unfortunately. Review comments are due soon, so I will circle back. Depending on the review comments, I will likely have a go at the next round in December. As ever, any thoughts are very welcome.


The overview of all the submissions is now up: Research Software Maintenance Fund: full application funding panels completed | Software Sustainability Institute

Reading this, it looks like they somewhat regret the process. In general, submitted grants scored very highly, with an average of 4.8, i.e. "very good, should be funded" (they made ad-hoc changes to their scale to add more "good" categories after submission).

They ended up funding 13 grants out of 143 submissions. I made a little calculator to look at the value add of grant schemes (see Grant application break-even analysis calculator · GitHub); I think this one was just above break even, i.e. just beneficial at an ecosystem level.
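The break-even idea can be sketched as comparing the total funded value against the total cost of everyone applying. This is a minimal illustration, not the actual calculator linked above: only the 13-of-143 figures come from the SSI summary, and every other number (award size, hours per application, hourly cost) is a made-up assumption.

```python
# Hypothetical sketch of an ecosystem-level break-even check for a grant call.
# Only the 13/143 figures come from the SSI summary; all other numbers are
# illustrative assumptions, not values from the linked calculator.

def ecosystem_value_add(
    n_submissions: int,
    n_funded: int,
    grant_value: float,            # assumed average award size
    hours_per_application: float,  # assumed effort per full application
    hourly_cost: float,            # assumed fully loaded staff cost per hour
) -> float:
    """Total funded value minus the total time cost of all applications."""
    total_funded = n_funded * grant_value
    total_application_cost = n_submissions * hours_per_application * hourly_cost
    return total_funded - total_application_cost

# Example with made-up inputs: 13 of 143 funded, the rest purely illustrative.
net = ecosystem_value_add(
    n_submissions=143,
    n_funded=13,
    grant_value=100_000,
    hours_per_application=80,
    hourly_cost=60,
)
print(f"Net ecosystem value add: {net:,.0f}")
```

Under these assumptions the call comes out ahead, but the margin shrinks quickly as application effort grows or the success rate falls, which is the point about full applications being a large ecosystem-wide time sink.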

In their wrap-up they wrote:

For this first round, we wanted to be deliberately broad, to understand the types of activity that the community felt needed funding.

I think this is a really disappointing approach from them, and something I would like grant funders not to do. A full grant application shouldn't be part of your market research, as it is such a large time sink for everyone in the ecosystem that isn't you, especially in such poorly funded areas as open-source software.

How did we do?

We scored 4/6, so "very good" on their scale, but well below average and not close to being funded. There wasn't much specific feedback on why, and what there is doesn't seem to really align with the scores. The main takeaway is to already have the staff in place.

I think my main learning point from applying for this and the CZI Essential Open Source Software grants is that there may not be much point applying for these generalist software grants unless infectious diseases are really in the zeitgeist. That is somewhat problematic in that there aren't domain-specific options of any kind that I am aware of. However, given the feedback, I think this application effectively had no chance regardless of content.

That being said, I am aware the people at Imperial applied for this as well, so perhaps they did better. If anyone knows how they got on, it would be interesting to compare notes.

Detailed reviewer comments:

Loses a point for limited quantification of maintenance outcomes (e.g., explicit release cadence, CI/coverage targets) and limited articulation of how deprecation/back-compat will be communicated and enforced

There was no clear outline of success metrics.

Weaknesses: There was no real evidence of contribution to the wider software ecosystem.

Weaknesses: The risk of being unable to recruit a suitable RSE is a significant risk.

We got our lowest score in this category and this was the only negative listed.

Marks off for currently informal governance (future state is promised but not yet defined), and for lack of concrete policies upfront (e.g., deprecation windows, versioning/release policy, minimum CI coverage %, bus-factor reduction plans).

Weaknesses: Establishing formal governance structures is not insignificant, as highlighted by being given its own work package. However the number of work packages does concern me

Overall comments

Reviewer 1:

Strengths: High-impact, widely used tools; thoughtful plan to reduce duplication and improve interfaces; credible team with prior ecosystem successes; strong community-building and EDIA orientation; metrics-aware management approach.

Weaknesses: Success hinges on hiring a specialist RSE and securing community uptake of new governance; several process details (release/deprecation policy, explicit coverage targets, concrete cross-project agreements) are not yet locked in; some outcomes would benefit from clearer, measurable acceptance criteria given the sizeable

Reviewer 2:

The objectives for the software, including creating a community around it, seem ideal. However I feel it has included more goals than are achievable within the timescale.

Reviewer 3:

Overall this is a robust proposal and one that is needed so that this type of software can evolve and assist researchers in an area that is in constant change.